1. 8

    Good on you. It’s worth mentioning here that Microsoft is going in the other direction. https://www.mercurynews.com/2018/06/19/microsoft-defends-ties-with-ice-amid-separation-outcry/amp/

    1.  

      In response to questions we want to be clear: Microsoft is not working with U.S. Immigration and Customs Enforcement or U.S. Customs and Border Protection on any projects related to separating children from their families at the border, and contrary to some speculation, we are not aware of Azure or Azure services being used for this purpose. As a company, Microsoft is dismayed by the forcible separation of children from their families at the border.

      Maybe I’m missing something, but it seems they are going in the exact same direction…

      1. 5

        It’s a very confusing article; my best guess is that they are working with ICE, but not on “projects related to separating children from their families at the border”.

        1. 8

          And even if Microsoft isn’t directly helping with the separations, they are still helping ICE. That nuance is discussed in OP’s article - any support to a morally corrupt institution is unacceptable, even if it is indirect support.

          1.  

            But that perspective is itself very un-nuanced. Is everything ICE does wrong? It’s a large organization. What if the company that @danielcompton denied service to is actually just building software to track down violent offenders who made it across the border? Or to fight drug trafficking?

            To go even further, by your statement, Americans should stop paying their taxes. Are you advocating that?

            1. 12

              ICE is a special case, and deserves to be disbanded. It’s a fairly new agency, and its primary mission is to be a Gestapo. So yes, very explicitly, everything ICE does is wrong.

              1.  

                On what grounds, and with what argument, can you support that statement? I mean, there is probably an issue with how it’s run, but the whole concept of ICE doesn’t sound that wrong to me.

                1. 6

                  From https://splinternews.com/tear-it-all-down-1826939873 :

                  The thing that is so striking about all three items is not merely the horror they symbolize. It is how easy it was to get all of these people to play their fascistic roles. The Trump administration’s family separation rule has not even been official policy for two months, and yet look at where we are already. The Border Patrol agent is totally unperturbed by the wrenching scenes playing out around him. The officers have sprung to action with a useful lie to ward off desperate parents. Nielsen, whom the New Yorker described in March as “more of an opportunist than an ideologue” and who has been looking to get back into Donald Trump’s good graces, is playing her part—the white supremacist bureaucrat more concerned with office politics than basic morality—with seeming relish. They were all ready.

                  I’m going to just delegate all arguments to that link, basically, with the comment that if it’s not exceedingly obvious, then I probably can’t say anything that would persuade you. Also, this is all extremely off-topic for this forum, but, whatevs.

              2. 8

                There’s always a nuance, sure. Every police force ever subverted for political purposes still kept fighting petty crime, preventing murders and helping old ladies cross the street. This has always given such regimes a great way to deflect criticism, paint critics as crime sympathisers and provide moral cover to the people working there and with them.

                America though, with all its lip service to small government and self-reliance, was the last place I expected to see that happening. Little did I know!

                1.  

                  Is everything ICE does wrong? It’s a large organization.

                  Just like people, organizations should be praised for their best behaviors and held responsible for their worst behaviors. Also, some organizations wield an incredible amount of power over people and can easily hide wrongdoing and therefore should be held responsible to the strictest standard.

                  1.  

                    It’s worth pointing out that ICE didn’t exist 20 years ago. Neither, for that matter, did the DHS (I was 22 when that monster was born). “Violent offenders” who “cross the border” will be tracked down by the same people who track down citizen “violent offenders”, i.e. the cops (what does “violent offender” even mean? How do we know who these people are? How will we know if they’re sneaking in?). And drug trafficking isn’t part of ICE’s institutional prerogative in any large, real sense, so it’s not for them to worry about?

                    Plenty of Americans, for decades, have advocated tax resistance precisely as a means to combat things like this. We can debate its utility, but it is absolutely a tactic that has seen use since, as far as I know, at least the Vietnam War.

                    Not sure how much nuance is necessary when discussing things like this. That doesn’t mean it’s open season to start dropping outrageous nonsense, but institutions which support/facilitate this in any way should be grounds for, at the very least, boycotts.

                    1.  

                      Why is it worth pointing out it didn’t exist 20 years ago? Smart phones didn’t either. Everything starts at some time.

                      To separate out arguments: this particular subthread is in response to MSFT helping ICE, but the comment I responded to was referring to the original post, which only refers to “border security”. My comment was really about the broader aspect, but I phrased it poorly. In particular, I think the comment I replied to, which states that you should not support anything like this even indirectly, basically means you can’t do anything.

                      1.  

                        It’s worth pointing out when it was founded, for a lot of reasons: what were the conditions that led to its creation? Were they good? Reasonable? Who created it? What was its mission originally? The date is important because all of these questions are easily accessible to anyone with a web browser and an internet connection, unlike, say, the formation of the FBI or the origins of Jim Crow, which, while definitely researchable on the net, are more the domain of historical research. Smartphones and ethnic cleansing, however, are not really in the same category.

                        1.  

                          If you believe the circumstances around the formation of ICE are worth considering, I don’t think pointing out the age of the institution is a great way to make that point. It sounds more like you’re saying “new things are inherently bad” rather than “20 years ago was a time with a lot of politically questionable activity” (or something along those lines).

                          1.  

                            Dude, read it however you want, but pointing out that ICE is less than 20 years old, when securing a border is a foundational issue, seems like a perfect way to intimate that this is an agency uninterested in actual security, formed expressly to fulfill a hyper-partisan, actually racist agenda. Like, did we not have border security or immigration services or customs enforcement prior to 2002/3? Why then? What was it? Also, given that it was formed so recently, it can be unformed; it can be dismantled that much more easily.

                    2.  

                      In addition, I bet ICE is using Microsoft Windows, and probably Office too.

                      1.  

                        That’s a great point, and no I don’t advocate for all Americans to stop paying taxes.

                      2.  

                        any support to a morally corrupt institution is unacceptable, even if it is indirect support

                        A very interesting position. It just requires you to stop using any currency. ;-)

                        1.  

                          No, it requires you to acknowledge that using any currency is unacceptable.

                          Of course not using any currency is also unacceptable. When faced with two unacceptable options, one has to choose one. Using the excuse “If I follow my ethics I can never do anything” is just a lazy way to never think about ethics. In reality everything has to be carefully considered and weighed on a case by case basis.

                          1.  

                            Of course not using any currency is also unacceptable.

                            Why? Currency is just a tool.

                            Using the excuse “If I follow my ethics I can never do anything” is just a lazy way to never think about ethics.

                            I completely agree.
                            Indeed I think that we can always be ethical, but we should look beyond the current “public enemy”, be it Cambridge Analytica or ICE. These are just symptoms. We need to cure the disease.

                1. 5

                  Finished the client side (Swift) and the backend (Common Lisp) of my first ever iOS app. Now looking into enabling remote push notifications via APNs. Unfortunately it’s HTTP/2 & ALPN only, and there’s no Common Lisp library to handle that directly. There’s an H2-14 reference implementation which unfortunately does not support ALPN and has suffered from some serious code rot. So my choices are either to fix that up, or to get HTTP/2 via some existing library (libcurl?).
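
                  If I do go the libcurl route, here’s a rough, untested sketch (in C, so it could be wrapped from CL via CFFI) of what the APNs request might look like. It assumes certificate-based auth and a libcurl built with HTTP/2 (nghttp2) and ALPN support; the topic, device token and cert paths are placeholders.

                      #include <curl/curl.h>

                      int main(void) {
                          curl_global_init(CURL_GLOBAL_DEFAULT);
                          CURL *curl = curl_easy_init();
                          if (!curl) return 1;

                          struct curl_slist *hdrs = NULL;
                          hdrs = curl_slist_append(hdrs, "apns-topic: com.example.myapp"); /* placeholder bundle id */
                          hdrs = curl_slist_append(hdrs, "content-type: application/json");

                          /* placeholder device token at the end of the URL */
                          curl_easy_setopt(curl, CURLOPT_URL,
                                           "https://api.push.apple.com/3/device/DEVICE_TOKEN");
                          /* ask for HTTP/2 over TLS, negotiated via ALPN */
                          curl_easy_setopt(curl, CURLOPT_HTTP_VERSION, (long)CURL_HTTP_VERSION_2TLS);
                          /* certificate-based auth; token (JWT) auth is the other option */
                          curl_easy_setopt(curl, CURLOPT_SSLCERT, "apns-cert.pem");
                          curl_easy_setopt(curl, CURLOPT_SSLKEY, "apns-key.pem");
                          curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
                          curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "{\"aps\":{\"alert\":\"hello\"}}");

                          CURLcode rc = curl_easy_perform(curl);

                          curl_slist_free_all(hdrs);
                          curl_easy_cleanup(curl);
                          curl_global_cleanup();
                          return rc == CURLE_OK ? 0 : 1;
                      }

                  The same curl_easy_setopt calls should translate fairly directly into a thin CFFI wrapper.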

                  1. 2

                    So Chromium + Mozilla account for 57 MLOC together, including comments and empty lines. Quite tiny compared to the billions of lines of legacy COBOL code we hear about!

                    1. 1

                      I doubt there are literally billions of COBOL lines ever written. No, deploying one package to more nodes doesn’t count, sorry.

                      IOW, [citation needed]

                      1. 1

                        Exactly; it’s just that this claim gets thrown around tech forums so much.

                    1. 3

                      Compiled languages definitely get an advantage out of strong typing and concrete API definitions, because planning resource utilization ahead of time allows many execution strategies, data layouts, and even caching and code hoisting to be exploited. The more you want to maximally “use” the hardware architecture, the more these grow to “fit” the hardware.

                      At the same time, dynamic languages/JITs are getting better at fitting the abstract expression of the programmer - functional programming can express very elaborate programs compactly, accurately and clearly, irrespective of the intermediate data types/APIs used in constructing them. The idea is to “fit” the nature of the abstractions being manipulated rather than the nature of how they are executed.

                      I’m currently debugging a symbolic configuration mechanism that was prototyped in a week in a dynamic language, but is meant to function in an embedded OS with a very low-level language, as part of a bootstrap. It is taking months to finish, mostly due to adapting the code to work in such a programming environment - you alter the assemblage of primitives to build enough of a virtual machine to handle the semantics of the necessary symbol processing. An oddball case, but it’s an example of the two (the virtue of this is that it allows enough “adaptability” at the low level that you don’t need to drag along the entire dynamic programming environment to serve huge amounts of low-level code that otherwise fits the compiled model perfectly).

                      1. 2

                        Compiled/interpreted and strongly/weakly typed have little to do with each other. Ditto for low/high level: Swift compiles to machine code but good luck maintaining any cache locality with its collections.

                        1. 1

                          Depends on the application. And yes, we don’t have a good model for cache locality: how much can we gain versus the complexity to code and maintain it?

                        2. 1

                          Can you elaborate on the strengths of the dynamic language that allowed you to prototype it so quickly? The difference in development time stated here is really striking.

                          1. 1

                            Sure. First, about the problem: “how do you configure unordered modules while discovering the graph of how they are connected?” The problem requires multilevel introspection of constructed objects, with “temporary” graph assignments in a multilevel discovery phase, then a successive top-down constructor phase with exception feedback.

                            The symbolic “middle layer” to support this was trivial to write in a language like Python using coroutines/iterators, and one could quickly refactor the topological exception-handling mechanism to deal with the corner cases, using annotation methods to handle them. So the problem didn’t “fight” the implementation.

                            With the lower-level compiled language, by contrast, too much needed to be rewritten each time to deal with an artifact, so in effect the data types and internal API changed to compensate, to fit the low-level model. It was also too easy to introduce new boundary-condition errors each time, while the former’s more compact representation, which didn’t thrash so much, didn’t have this problem.

                            Sometimes with low level code, you almost need an expert system to maintain it.

                        1. 10

                          I support Google workers unionizing. Naver, the #1 tech company in South Korea, recently unionized.

                          1. 3

                            Working in large tech companies can be depressing enough even without compensation determined by union seniority.

                            1. 1

                              This kind of outcome entirely depends on how you unionize.

                              1. 4

                                In theory, sure. In practice tho, it’s really hard to find examples of strong unions without a formalized payscale.

                                1. 1

                                  FYI, the Naver union is one such strong union without a formalized payscale.

                              2. -1

                                are you american?

                                1. 2

                                  No. Why?

                              3. 1

                                They would definitely benefit from a union.

                                But they face stronger challenges in this process than most other employees around the West.

                                First the individualistic culture/propaganda they have been fed for years could prevent them from trading the promise of individual benefits for the reality of shared ones.

                                But most relevant is the distribution of the company around the world. There are different legal and economic environments out there, and it’s likely that several different unions will compete for Google employees, each focused on the local issues and interests people see most.

                                OTOH Google guys are pretty smart; they could learn how to think globally and how to balance long-term benefits with short-term local ones more easily than other workers.

                                1. 3

                                  First the individualistic culture/propaganda they have been fed for years could prevent them from trading the promise of individual benefits for the reality of shared ones.

                                  I would say this is widespread in the tech community. This is not just Google or the USA; the same situation is very common here in Europe. That’s why ideologies like libertarianism or identity politics spread more easily in the tech community.

                                  That said, I believe people like the Tech Worker Alliance are getting it right in the way they communicate: they disguise the unionist message as something else, avoiding “problematic words” and basically working around the prejudice of the average programmer. I believe this is the key to a new unionist movement around the West, because the old one cannot be revived; it is tainted by its own problems and by years of anti-unionism.

                                  1. 0

                                    First the individualistic culture/propaganda they have been fed for years could prevent them from trading the promise of individual benefits for the reality of shared ones.

                                    The xkcd “sheeple” comic was given to the contrarian in this thread, but it applies to this too. Your hubris is just less overt and more civilized, but almost as off-putting from my perspective.

                                    1. 3

                                      Your hubris…

                                      Can you please elaborate?

                                      I just noticed two issues that a union at Google would face: a cultural bias, and the legislative/economic complexity due to the multinational nature of the company.

                                      The cultural bias against unions is well known in the USA, mainly because the country’s narrative is based on competition and on the solo hero/entrepreneur. In other countries the national narrative values cooperation a lot more, for example in families (which are large groups of people), in tribes, in churches and so on…
                                      This is a historical and sociological observation that could be false, but it doesn’t look like hubris.

                                      The multinational observation is also self-evident: in different countries around the world, laws are different, cultures are different, economic issues are different and so on… The tendency to focus only on the local space could be exploited by the ownership of the company to divide and conquer the employees.

                                      Finally, while I do not like the Google corporate culture that I’ve found among engineers, I acknowledge they hire very smart people, and what happened with Project Maven shows how much they care about their work.

                                      So I think they have a unique chance to overcome these difficulties.

                                      The fact is that a company that has built an internal narrative based on knowledge and intelligence now faces the fact that its employees collectively hold more knowledge and intelligence than the ownership, by several orders of magnitude.

                                      If knowledge is power, then at Google the employees are more powerful than the ownership.

                                      They still call the ownership “leadership”, but they will soon realize they are the true leaders there.

                                      1. -2

                                        Your hubris is in the implication that individualism is not a reasonable stance to hold. Why would anyone knowingly choose “promises” over reality?

                                        I’m not interested in having a long drawn conversation with you on this topic. I’m just saying, as someone who disagrees with you, your comment is deserving of the “sheeple” xkcd. You might want to reconsider your approach.

                                        1. -1

                                          Your hubris is …

                                          I’m not interested in having a long drawn conversation with you …

                                          your comment is deserving of the “sheeple” xkcd …

                                          You might want to reconsider your approach

                                          Yo, post something worthwhile or don’t post at all.

                                1. 2

                                  That’s rich, coming from a guy who did his best to advance the client-server cloud model in his time.

                                  OK, I’m not really happy about the acquisition either, but overall GitHub has been a massive boon to the community in general. It lowered the threshold for collaborating and publishing your projects, and facilitated a bunch of dependency-fetching ecosystems with much higher availability than was possible before.

                                  1. 5

                                    How did he do that?

                                    I thought he was involved in writing the Netscape Navigator browser and its mail component, neither of which promotes the cloud model.

                                    1. 2

                                      You posted that comment using a web browser which identifies itself as “Mozilla” and a cloud-hosted application called “lobste.rs”. IMO it’s fair to say that someone who was both a primary author of Mozilla-the-browser and a founder of mozilla.org was involved in enabling, even promoting the model lobste.rs uses.

                                      1. 2

                                        This is basically an argument that the web itself, or really any client-server approach, promotes the cloud model, which I find absurd. Cloud hosting wasn’t a technologically inevitable outcome; you could build something similar to email. You still can, since you can use those same technologies JWZ helped build to run your stuff on your own hardware.

                                        I don’t remember either JWZ or Mozilla in his time promoting running stuff in the cloud (other people’s computers).

                                        1. 1

                                          He wrote software that made it feasible to put even user interface code on a server running in a colo somewhere. The UI on such software was primitive and laggy compared to using alternatives like MFC or Qt, but on the other hand a webapp didn’t have to be purchased, downloaded or installed.

                                          I don’t recall him saying that anyone should write webapps. But he wrote software that made it feasible, and did his best to get that software installed everywhere.

                                          1. -1

                                            Other people’s computers? You make it sound like a P2P network. I know zero cloud services hosted on other people’s computers, as opposed to other corporations.

                                            Oh, and funny how email was decentralized right up until its consolidation onto browser-based client-server (sorry, cloud) platforms.

                                            1. 3

                                              “Other people’s computers” is a popular description of where cloud-hosted apps run. I don’t think anyone, certainly not me, means P2P by that.

                                              Email is still decentralized. You can run your own server, as I and many others do. It can also have a webmail interface, like mine does, and that has been true for two decades. The fact that users are consolidating on a few providers does not make the underlying technology more “cloudy”, and the fact that this did not happen for the first decade also strongly suggests that the change did not happen because of the underlying (web) technology.

                                              1. 1

                                                What share of the world’s email has to be stored in a single database before you consider it centralised? 50% perhaps?

                                                Google alone hosts a two-digit percentage of email users. I’ve heard the number 25% mentioned. Assuming one From address per message, an average of 1.4 To/Cc addresses and a 25% market share for Google, Google stores 50% of the email that was sent yesterday on behalf of the sender or any recipient. I self-host, so Google stores about 33% of my email.

                                                (I made up the number 1.4. I don’t really care about the precise details. And I don’t care about whether you want to consider just Google or the also the next ten big hosters.)
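
                                                For what it’s worth, here’s a quick back-of-envelope check of those figures, treating the addresses on a message as independent and plugging in the made-up numbers above:

                                                    /* back-of-envelope only; compile with: cc calc.c -lm */
                                                    #include <math.h>
                                                    #include <stdio.h>

                                                    int main(void) {
                                                        double google_share = 0.25; /* assumed Google market share */
                                                        double to_cc = 1.4;         /* assumed average To/Cc addresses per message */

                                                        /* P(at least one of the From + To/Cc addresses is Google-hosted) */
                                                        double any_message = 1.0 - pow(1.0 - google_share, 1.0 + to_cc);
                                                        /* same, but for a self-hosted sender: only the recipients count */
                                                        double self_hosted = 1.0 - pow(1.0 - google_share, to_cc);

                                                        printf("any message: %.0f%%, self-hosted sender: %.0f%%\n",
                                                               any_message * 100.0, self_hosted * 100.0);
                                                        return 0;
                                                    }

                                                That prints roughly 50% and 33%; tweak the assumed share or address count and the figures move accordingly.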

                                                1. 1

                                                  This debate has moved far away from JWZ and cloud to what feels off topic to main theme (Github+MS).

                                                  Since you asked, I have no idea what percentage of contained data, if any, should be the limit at which something counts as centralized. I think your question reveals an underlying dilemma, which is: are we talking about effectively centralized, in the sense that for all intents and purposes everything happens in one place, or actually centralized, in the sense that it can’t happen elsewhere?

                                                  Clearly in the second sense email is not centralized as one can demonstrably run their own server as still so many do without penalties as long as the server is properly configured. It might not make economic or otherwise sense, but at least for now you are not technologically locked out.

                                                  I don’t think it is centralized in the first sense either, and I am not sure your metric is valid. In that sense the whole web is already centralized, or was, as Google scraped everything public, so in a way it stored close to all of it. Let’s imagine that we are left with only two email providers of approximately equal size and usage pattern. Then by your approach each of them would contain more or less all email, and yet neither of them would actually be in a position where everyone had to use it.

                                                  And to bring this closer to the thread’s original topic, I don’t think any of this has much to do with the web as such. It happened because the cost of running your own server did not fall like the cost of hosted accounts, which also provided a degree of freedom compared to ISP or company accounts. What the web did do, as it improved, is change the client used to access email, as there was less need for native OS ones. And even that is not completely true, since Gmail has native clients for both Android and iOS.

                                                  I think we would have moved to “cloud” services over time even if the web did not exist or had remained limited to HTML 2. We would just be using Windows apps to do so.

                                    1. 23

                                      Kinda late on UNIX bashing bandwagon :)

                                      Also, Windows owes more of it’s legacy to VMS.

                                      1. 10

                                        It does, but users don’t use any of the VMSy goodness in Windows: To them it’s just another shitty UNIX clone, with everything being a file or a program (which is also a file). I think that’s the point.

                                        Programmers rarely even use the VMSy goodness, especially if they also want their stuff to work on Mac. They treat Windows as a kind of retarded UNIX cousin (which is a shame because the API is better; IOCP et al)

                                        Sysadmins often struggle with Windows because of all the things underneath that aren’t files.

                                        Message/object operating systems are interesting, but for the most part (OS/2, BeOS, QNX) they degraded into this “everything is a file” nonsense…

                                        Until they got rid of the shared filesystem: iOS finally required messaging for applications to communicate on their own, and while it’s been rocky, it’s starting to paint a picture for the next generation, who will finally make an operating system without files.

                                        1. 10

                                          If we’re talking user experience, it’s more a CP/M clone than anything. Generations later, Windows still smells of COMMAND.COM.

                                          1. 6

                                            yes, the bowels are VMS, the visible stuff going out is CP/M

                                            1. 4

                                              Bowels is a good metaphor. There’s good stuff in Windows, but you’ve got to put on a shoulder length glove and grab a vat of crisco before you can find any of it.

                                          2. 10

                                            I think you’re being a little bit harsh. End-users definitely don’t grok the VMSy goodness; I agree. And maybe the majority of developers don’t, either (though I doubt the majority of Linux devs grok journald v. syslogs, really understand how to use /proc, grok Linux namespaces, etc.). But I’ve worked with enough Windows shops to promise you that a reasonable number of Windows developers do get the difference.

                                            That said, I have a half-finished book from a couple years ago, tentatively called Windows Is Not Linux, which dove into a lot of the, “okay, I know you want to do $x because that’s how you did it on Linux, and doing $x on Windows stinks, so you think Windows stinks, but let me walk you through $y and explain to you why it’s at least as good as the Linux way even though it’s different,” specifically because I got fed up with devs saying Windows was awful when they didn’t get how to use it. Things in that bucket included not remoting in to do syswork (use WMI/WinRM), not doing raw text munging unless you actually have to (COM from VBScript/PowerShell are your friends), adapting to the UAC model v. the sudo model, etc. The Windows way can actually be very nice, but untraining habits is indeed hard.

                                            1. 6

                                              I don’t disagree with any of that (except maybe that I’m being harsh), but if you parse what I’m saying as “Windows is awful” then it’s because my indelicate tone has been read into instead of my words.

                                              The point of the article is that those differences are superficial, and mean so very little to the mental model of use and implementation as to make no difference: IOCP is just threads and epoll, and epoll is just IOCP and fifos. Yes, IOCP is better, but I desperately want to see something new in how I use an operating system.

                                              I’ve been doing things roughly the same way for nearly four decades, despite the fact that I’ve done Microsoft/IBM for a decade, Linux since Slackware 1.1 (Unix since tapes of SCO), Common Lisp (of all things) for a decade, and OSX for nearly that long. They’re all the same, and that point is painfully clear to anyone who has actually used these things at a high level: I edit files, I copy files, I run programs. Huzzah.

                                              But: It’s also obvious to me who has gone into the bowels of these systems as well: I wrote winback which was for a long time the only tools for doing online Windows backups of standalone exchange servers and domain controllers; I’m the author of (perhaps) the fastest Linux webserver; I wrote ml a Linux emulator for OSX; I worked on ECL adding principally CL exceptions to streams and the Slime implementation. And so on.

                                              So: I understand what you mean when you say Windows is not Linux, but I also understand what the author means when he says they’re the same.

                                              1. 2

                                                That actually makes a ton of sense. Can I ask what would qualify as meaningfully different for you? Oberon, maybe? Or a version of Windows where WinRT was front-and-center from the kernel level upwards?

                                                1. 2

                                                  I didn’t use the term “meaningfully different”, so I might be interpreting your question too broadly.

                                                  When I used VMS, I never “made a backup” before I changed a file. That’s really quite powerful.

                                                  The Canon Cat had “pages” you would scroll through. Like other forth environments, if you named any of your blocks/documents it was so you could search [leap] for them, not because you had hierarchy.

                                                  I also think containers are very interesting. The encapsulation of the application seems to massively change the way we use them. Like the iOS example, they don’t seem to need “files” since the files live inside the container/app. This poses some risk for data portability. There are other problems.

                                                  I never used Oberon or WinRT enough to feel as comfortable commenting about them as I do about some of these other examples.

                                              2. 2

                                                If it’s any motivation I would love to read this book.

                                                Do you know of any books or posts I could read in the meantime? I’m very open to the idea that Windows is nice if you know which tools and mental models to use, but kind of by definition I’m not sure what to Google to find them :)

                                                1. 4

                                                  I’ve just been hesitant because I worked in management for two years after I started the book (meaning my information atrophied), and now I don’t work with Windows very much. So, unfortunately, I don’t immediately have a great suggestion for you. Yeah, you could read Windows Internals 6, which is what I did when I was working on the book, but that’s 2000+ pages, and most of it honestly isn’t relevant for a normal developer.

                                                  That said, if you’ve got specific questions, I’d love to hear them. Maybe there’s a tl;dr blog post hiding in them, where I could salvage some of my work without completing the entire book.

                                              3. 7

                                                but users don’t use any of the VMSy goodness in Windows: To them it’s just another shitty UNIX clone, with everything being a file or a program (which is also a file). I think that’s the point.

                                          Most users don’t know anything about UNIX and can’t use it. On the UI side, pre-NT Windows was a Mac knockoff mixed with MS-DOS, which was based on a DOS they got from a third party. Microsoft even developed software for Apple in that time. Microsoft’s own users had previously learned the MS-DOS menus and some commands. Then they had a nifty UI like Apple’s running on MS-DOS. Then Microsoft worked with IBM to make a new OS/2 with its philosophy. Then Microsoft acquired the OpenVMS kernel team, made a new kernel, and built a new GUI with wizard-based configuration of services, versus the command line, text, and pipes of UNIX.

                                                So, historically, internally, layperson-facing, and administration, Windows is a totally different thing than UNIX. Hence, the difficulty moving Windows users to UNIX when it’s a terminal OS with X Windows vs some Windows-style stuff like Gnome or KDE.

                                          You’re also overstating “everything is a file” by conflating OSes that merely store programs and such in files with those, like UNIX or Plan 9, that use the file metaphor for about everything. It’s a false equivalence: from what I remember, you don’t get your running processes in Windows by reading the filesystem, since it doesn’t use that metaphor or API. It’s object-based, with API calls specific to different categories. Different philosophy.

                                                1. 3

                                                  Bitsavers has some internal emails from DEC at the time of David Cutler’s departure.

                                                  I have linked to a few of them.

                                                  David Cutler’s team at DECwest was working on Mica (an operating system) for PRISM (a RISC CPU architecture). PRISM was canceled in June of 1988. Cutler resigned in August of 1988 and 8 other DECwest alumni followed him at Microsoft.

                                              4. 5

                                                I have my paper copy of The Unix Hater’s Handbook always close at hand (although I’m missing the barf bag, sad to say).

                                                1. 5

                                                  I always wanted to ask the author of The Unix Hater’s Handbook if he’s using Mac OS X

                                                  8~)

                                                  1. 5

                                                    It was edited by Simson Garfinkel, who co-wrote Building Cocoa Applications: a step-by-step guide. Which was sort of a “port” of Nextstep Programming Step One: object-oriented applications

                                                    Or, in other words, “yes” :)

                                                    1. 2

                                                      Add me to the list curious about what they ended up using. The hoaxers behind UNIX admitted they’ve been coding in Pascal on Macs. Maybe it’s what the rest were using if not Common LISP on Macs.

                                                  2. 7

                                            Beat me to it. The author is full of it when saying Windows is built on UNIX. Microsoft stealing, cloning, and improving OpenVMS into Windows NT is described here. This makes the Linux zealots’ parodies about a VMS desktop funnier, given that one destroyed Linux in the desktop market. So, we have the VMS and UNIX family trees going in parallel, with the UNIX tree having more branches.

                                                    1. 4

                                                      The author doesn’t say Windows is built on Unix.

                                                      1. 5

                                                        “we are forced to choose from: Windows, Apple, Other (which I shall refer to as “Linux” despite it technically being more specific). All of these are built around the same foundational concepts, those of Unix.”

                                                        Says it’s built on the foundational concepts of UNIX. It’s built on a combo of DOS, OS/2, OpenVMS, and Microsoft concepts they called the NT kernel. The only thing UNIX-like was the networking stack they got from Spider Systems. They’ve since rewritten their networking stack from what I heard.

                                                        1. 4

                                                          Says it’s built on the foundational concepts of UNIX.

                                                          I don’t see any reason to disagree with that.

                                                          The only thing UNIX-like …

                                                          I don’t think that’s a helpful definition of “unix-like”.

                                                          It’s got files. Everything is a file. Windows might even be a better UNIX than Linux (since UNC)

                                                          Cutler might not have liked UNIX very much, but Windows NT ended up UNIX anyway because none of that VMS-goodness (Versions, types, streams, clusters) ended up in the hands of Users.

                                                          1. 10

                                                            It’s got files. Everything is a file.

                                                    Windows is object-based. It does have files, which are another kind of object. The files come from MULTICS, which UNIX also copied in some ways; even the name was a play on it: UNICS. I think Titan invented the access permissions. The internal model, with its subsystems, was more like a microkernel design running OS emulators as processes. They did their own thing for most of the rest with the Win32 API and the registry. Again, not quite how a UNIX programming guide teaches you to do things. They got clustering later, too, with them and Oracle using the distributed-lock approach from OpenVMS.

                                                    Windows and UNIX are very different in their approach to architecture. They’re different in how a developer is expected to build individual apps and compose them. It wasn’t even developed on UNIX: they used OS/2 workstations for that. There’s no reason to say Windows is grounded in the UNIX philosophy. It’s a lie.

                                                            “Windows NT ended up UNIX anyway because none of that VMS-goodness (Versions, types, streams, clusters) ended up in the hands of Users.”

                                                            I don’t know what you’re saying here. Neither VMS nor Windows teams intended to do anything for UNIX users. They took their own path except for networking for obvious reasons. UNIX users actively resisted Microsoft tech, too. Especially BSD and Linux users that often hated them. They’d reflexively do the opposite of Microsoft except when making knockoffs of their key products like Office to get desktop users.

                                                            1. 3

                                                              Windows is object-based.

                                                              Consider what methods of that “object” a program like Microsoft Word must be calling besides “ReadFile” and “WriteFile”.

                                                              That the kernel supports more methods is completely pointless. Users don’t interact with it. Programmers avoid it. Sysadmins don’t understand it and get it wrong.

                                                              I don’t know what you’re saying here.

                                                              That is clear, and yet you’re insisting I’m wrong.

                                                              1. 3

                                                                Except, that’s completely wrong.

                                                                I just started Word and dumped a summary of its open handles by object type:

                                                                C:\WINDOWS\system32>handle -s -p WinWord.exe
                                                                
                                                                Nthandle v4.11 - Handle viewer
                                                                Copyright (C) 1997-2017 Mark Russinovich
                                                                Sysinternals - www.sysinternals.com
                                                                
                                                                Handle type summary:
                                                                  ALPC Port       : 33
                                                                  Desktop         : 1
                                                                  Directory       : 3
                                                                  DxgkSharedResource: 2
                                                                  DxgkSharedSyncObject: 1
                                                                  EtwRegistration : 324
                                                                  Event           : 431
                                                                  File            : 75
                                                                  IoCompletion    : 66
                                                                  IoCompletionReserve: 1
                                                                  IRTimer         : 8
                                                                  Key             : 171
                                                                  KeyedEvent      : 24
                                                                  Mutant          : 32
                                                                  Process         : 2
                                                                  Section         : 67
                                                                  Semaphore       : 108
                                                                  Thread          : 138
                                                                  Timer           : 7
                                                                  Token           : 3
                                                                  TpWorkerFactory : 4
                                                                  WaitCompletionPacket: 36
                                                                  WindowStation   : 2
                                                                Total handles: 1539
                                                                

                                                                Each of these types is a distinct kernel object with its own characteristics and semantics. And yes, you do create and interact with them from user-space. Some of those will be abstracted by lower-level APIs, but many are directly created and managed by the application. You’ll note the number of open “files” is a very small minority of the total number of open handles.

                                                                Simple examples of non-file object types commonly manipulated from user-land include Mutants (CreateMutex) and Semaphores (CreateSemaphore). Perhaps the most prominent example is manipulating the Windows Registry; this entails opening “Key” objects, which per above are entirely distinct from regular files. See the MSDN Registry Functions reference.
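
                                                        To make the distinction concrete, here’s a small, untested C sketch of touching two of those object types from user-land; the mutex name and the registry value queried are just examples:

                                                            /* link against advapi32 for the Reg* calls */
                                                            #include <windows.h>
                                                            #include <wchar.h>

                                                            int main(void) {
                                                                /* A named Mutant (mutex): a kernel object reached through
                                                                   a handle, not a file anywhere on disk. */
                                                                HANDLE mtx = CreateMutexW(NULL, FALSE, L"Local\\ExampleMutex");

                                                                /* A Key object: opened through the Registry API, not CreateFile. */
                                                                HKEY key;
                                                                wchar_t product[256] = L"";
                                                                DWORD size = sizeof(product), type = 0;
                                                                if (RegOpenKeyExW(HKEY_LOCAL_MACHINE,
                                                                                  L"SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion",
                                                                                  0, KEY_READ, &key) == ERROR_SUCCESS) {
                                                                    if (RegQueryValueExW(key, L"ProductName", NULL, &type,
                                                                                         (LPBYTE)product, &size) == ERROR_SUCCESS &&
                                                                        type == REG_SZ) {
                                                                        wprintf(L"%ls\n", product); /* e.g. the Windows edition name */
                                                                    }
                                                                    RegCloseKey(key);
                                                                }

                                                                if (mtx) CloseHandle(mtx);
                                                                return 0;
                                                            }

                                                        The Key object happens to be backed by a hive file on disk, but you never touch that file directly, and the Mutant never exists as a file at all.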

                                                                1. 0

                                                                  None of these objects can exist on a disk; they cannot persist beyond shutdown, and do not have any representation beyond their instantaneous in-memory instance. When someone wants an “EtwRegistration” they’re creating it again and again.

                                                                  Did you even read the article? Or are you trolling?

                                                                  1. 3

                                                                    None of these objects can exist on a disk; they cannot persist beyond shutdown, and do not have any representation beyond their instantaneous in-memory instance. When someone wants an “EtwRegistration” they’re creating it again and again.

                                                            Key objects do typically exist on disk. Admittedly, the underlying datastore for the Registry is a series of files, but you never directly manipulate those files. In the same sense that you may ask for C:\whatever.txt, you may ask for HKLM:\whatever. We need to somehow isolate the different persisted data streams, and that isolation mechanism is a file. That doesn’t mean you have to directly manipulate those files if the operating system provides higher-level abstractions. What exactly are you after?

                                                                    From the article:

                                                                    But in Unix land, this is a taboo. Binary files are opaque, say the Unix ideologues. They are hard to read and write. Instead, we use Text Files, for it is surely the path of true righteousness we have taken.

                                                                    The Windows Registry, which is a core part of the operating system, is completely counter to this. It’s a bunch of large binary files, precisely because Microsoft recognised storing all that configuration data in plain text files would be completely impractical. So you don’t open a text file and write to it, you open a Registry key, and store data in it using one of many predefined data types (REG_DWORD, etc…).

                                                                    Did you even read the article? Or are you trolling?

                                                                    It sounds like you’re not interested in a constructive and respectful dialogue. If you are, you should work on your approach.

                                                                    1. -3

                                                                      What exactly are you after?

                                                                      Just go read the article.

                                                                      It’s about whether basing our entire interactions with a computer on a specific reduction of verbs (read and write) is really exploring what the operating system can do for us.

                                                                      That is a very interesting subject to me.

                                                              Some idiot took exception to the idea that Windows is basically “built on Unix”, then back-pedalled it to be about whether it was based on the same “foundational” concepts, then chose to narrowly and uniquely interpret “foundational” in a very different way than the article.

                                                              Yes, Windows has domains and registries and lots of directory services, but they all have the exact same “file” semantics.

                                                                      But now you’re responding to this strange interpretation of “foundational” because you didn’t read the article either. Or you’re a troll. I’m not sure which yet.

                                                                      Read the article. It’s not well written but it’s a very interesting idea.

                                                                      Each of these types is a distinct kernel object with its own characteristics and semantics

                                                                      Why do you bring this up in response to whether Windows is basically the same as Unix? Unix has lots of different kernel “types” all backed by “handles”. Some operations and semantics are shared by handles of different types, but some are distinct.

                                                                      I don’t understand why you think this is important at all.

                                                                      It sounds like you’re not interested in a constructive and respectful dialogue. If you are, you should work on your approach.

                                                                      Do you often jump into the middle of a conversation with “Except, that’s completely wrong?”

                                                                      Or are you only an asshole on the Internet?

                                                                      1. 4

                                                                        Or are you only an asshole on the Internet?

                                                                        I’m not in the habit of calling people “asshole” anywhere, Internet or otherwise. You’d honestly be more persuasive if you just made your points without the nasty attacks. I’ll leave it at that.

                                                              2. 2

                                                                networking for obvious reasons

                                                                Them being what? Is the BSD socket API really the ultimate networking abstraction?

                                                                1. 7

                                                                  The TCP/IP protocols were part of a UNIX. AT&T gave UNIX away for free. They spread together with early applications being built on UNIX. Anyone reusing the protocols or code will inherit some of what UNIX folks were doing. They were also the most mature networking stacks for that reason. It’s why re-using BSD stacks was popular among proprietary vendors. On top of the licensing.

                                                                  Edit: Tried to Google you a source talking about this. I found one that mentions it.

                                                    1. 2

                                                      Has some mitigation of Spectre/Meltdown styled OOE attacks, apparently.

                                                      1. 1

                                                        I was really hoping for them to announce new Macbooks. I’ve been considering switching from Arch Linux to macOS, mostly because that’s one platform where people still “buy” apps. So as a developer, that could be an interesting platform for going the indie route.

                                                        The current lineup looks strange though. The Air doesn’t have a retina screen and the RAM is stuck at 8GB. And the Macbook Pros (even the one without the silly touch bar) are so insanely expensive, it doesn’t even make sense.

                                                        Let’s see when they have the next event.

                                                        1. 1

                                                            FWIW I use a 12” MB with the entry-level CPU, topped up to 16GB RAM. Fine so far. No touch bar, retina screen, great to take places, and fanless to boot.

                                                        1. 4

                                                          Work, creating devicetree definitions for a new board.

                                                            Hobby, learning how to make an app with Swift/Xcode.

                                                          1. 4

                                                            Lots of C bashing going on here.

                                                            I’ll only comment that C is used today mainly in the embedded domain, a place where it is strong and growing (in terms of jobs etc).

                                                            1. 1

                                                              Perhaps WebAssembly will bring it out to the frontend!

                                                              1. 3

                                                                Certainly, WebAssembly is bringing a lot of good existing C/C++ code to the frontend. In a personal project, I’m using libogg, libopus and libspeexdsp. I find it really cool to be able to use these from the web! (I guess these particular libs lend themselves well, because they have little interaction with the OS, and are very portable.)

                                                                And then there’re also the big names in game development porting their engines, of course.
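
                                                                For anyone curious about the mechanics, here’s a toy sketch (not from my project; the file and function names are made up) of exporting a plain C function to JavaScript with Emscripten:

                                                                    /* mix.c - a made-up example; build with: emcc mix.c -o mix.js */
                                                                    #include <emscripten/emscripten.h>

                                                                    /* EMSCRIPTEN_KEEPALIVE keeps the symbol in the wasm exports
                                                                       so JavaScript can call it. */
                                                                    EMSCRIPTEN_KEEPALIVE
                                                                    int mix_samples(int a, int b) {
                                                                        return (a + b) / 2; /* trivial stand-in for real DSP work */
                                                                    }

                                                                After building, the function should be reachable from JS as Module._mix_samples once the runtime has initialized; the bigger libraries above work the same way, just with more build plumbing.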

                                                            1. 2

                                                              Wrapping two legacy embedded system projects into virtual machine images for future preservation.

                                                              1. 1

                                                                So elisp is getting green threads, interesting. I wonder if that’s the end goal, or are there plans to transition to native threads?

                                                                1. 2

                                                                  I guess I should be thankful that my circuit breaker just cuts off electricity when there is too much load so it is like a forced reboot at least once a month.

                                                                  The FBI recommends any owner of small office and home office routers reboot the devices to temporarily disrupt the malware and aid the potential identification of infected devices. Owners are advised to consider disabling remote management settings on devices and secure with strong passwords and encryption when enabled. Network devices should be upgraded to the latest available versions of firmware.

                                                                    I would like to learn more about this. I am pretty sure Verizon has a backdoor into my FiOS-G1100 Wi-Fi router. Does anyone else have this router? What do you see when you go to http://myfiosgateway.com/#/monitoring ? I see:

                                                                    UI Version: v1.0.294
                                                                    Firmware Version: 02.00.01.08
                                                                    Model Name: FiOS-G1100
                                                                    Hardware Version: 1.03

                                                                  1. 2

                                                                      Access to your router is likely not publicly routable. I can’t access that web page (connection failed).

                                                                    1. 1

                                                                        Ah, I should have mentioned that you need to be at home behind your FiOS G1100 router, log in, and click on System Monitoring in the top right corner.

                                                                      Here’s the router/modem in question: https://www.verizon.com/home/accessories/fios-quantum-gateway/

                                                                    2. 1

                                                                      Why do you think Verizon has a backdoor?

                                                                      1. 2

                                                                          They, along with other ISPs, took tens to hundreds of millions of dollars to backdoor their networks for the NSA. That was in the leaks. You should assume they might backdoor anything else.

                                                                        1. 1

                                                                          Got a link to the specific leaks?

                                                                          1. 1

                                                                            Forbes article.

                                                                        2. 2

                                                                            One man’s backdoor is another man’s mass-provisioning service.

                                                                          1. 1

                                                                            Maybe I used an incorrect technical word. I meant to say I think they can remotely access and configure the modem / router.

                                                                            1. 1

                                                                                ISPs backdooring home routers isn’t unknown, where here I use ‘backdooring’ to mean “the ISP can log in and make changes even though most home users don’t know they can do this”. Some use it to push out router firmware updates (for their preferred models).

                                                                          1. 1

                                                                            Bonus points for using nyan-mode!

                                                                            1. 1

                                                                                I remember they had all these shows in the ’90s talking about how everything was going to be like the Jetsons by the year 2000. We’d have personal robots, flying cars, and so on. Instead, they spent $150 billion rewriting COBOL because the computers of the day couldn’t keep track of time properly once the year 2000 rolled around. I was so let down.

                                                                              1. 2

                                                                                The TL;DR of the article: all personal robots in the 80s were a major letdown except the one that was least successful, because it cost as much as a new car & didn’t have any arms.

                                                                                1. 2

                                                                                    I think home robots are kind of a letdown now, in the sense that the Roombas you can actually go out and buy today aren’t good enough at navigating the floor to be better at vacuuming than I am with a canister vac, 3D printers are a niche hobbyist thing, and Alexa is so privacy-invasive that I refuse to use it on principle.

                                                                                  Of course, half the things that people imagined home robots would do in the 80s - basically all the things that you can do without the device physically moving - have been accomplished by personal computers and smartphones, to wild success. I can use my phone and ubiquitous wireless internet connection to play music or read encyclopedias or shitpost about cryptocurrencies, all of which I’m sure would blow the minds of the 80s people typing things into the keyboard attached to their 80s home robot.

                                                                                    But my phone and computer don’t move on their own, so I wouldn’t call them robots.

                                                                                  1. 2

                                                                                      When you follow Google Maps directions, or steer toward highly ranked restaurants, or attend the meetings in your calendar, they do move of their own volition. Kind of.

                                                                                    1. 2

                                                                                      Heh, yeah. Exactly in that restricted sense of “move around” that you say. That’s the sense of “move around” that turned out to be useful.

                                                                                1. 2

                                                                                    TBH it sounds apocryphal. The system was in a building away from the tracks, and no living thing can give off that amount of ionizing radiation while still, well, living.

                                                                                1. 4

                                                                                  Honestly, the driving script in bash feels like it’s cheating. It actually feels like despite all the new toys, metaprogramming in C++ still isn’t that powerful.

                                                                                  Couldn’t this be done more easily with lisp macros? I can sort of see how to do it with D compile-time structures.

                                                                                  1. 3

                                                                                      I don’t think there’s much point comparing such exercises across languages. For instance, with Template Haskell you can run arbitrary Haskell code and even do IO at compile time (you could even write a 3D shooter), but I’d still say C++ templates are more powerful than TH in many respects, due to the way they interact with the rest of their respective languages.

                                                                                    1. 1

                                                                                      Maybe I shouldn’t have said “powerful”, but “convenient”? I think it does make sense to have these comparisons at least for this example. In both Lisp and D, you have all of the language at compile time, so you can do just about anything.

                                                                                      It appears that even when attempting a ridiculous feat, thus accepting some inconvenience, C++ compile-time features are still too onerous to put the whole game loop into them.

                                                                                      Edit: After thinking about this for a second, I’m not sure it’s possible in D anymore since compile-time D functions have to be deterministic.

                                                                                      1. 2

                                                                                          I understand your point about the convenience, but my point is that the real purpose of the metaprogramming features isn’t to write interactive games. What matters is how they interact with the run-time features. For instance, C++ templates are more powerful than Template Haskell, because of template argument deduction and due to how template instantiation can cause other templates to be instantiated seamlessly. Whereas in TH, you cause all template expansions by hand. Without considering the interaction with the rest of the language, the best metaprogramming would simply be generating C++ code using C++, then running that program as a preprocessing step. That’s why I think comparing the power of metaprogramming features across languages through non-metaprogramming things you can do with them is pointless.
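
                                                                                          A contrived sketch of the kind of cascade I mean (my own toy example, not from the article): deducing N from the argument picks a template, and instantiating that template quietly instantiates everything it depends on.

                                                                                              #include <array>
                                                                                              #include <cstddef>
                                                                                              #include <iostream>

                                                                                              // Instantiating Sum<N> recursively instantiates Sum<N-1>, ..., Sum<0>.
                                                                                              template <std::size_t N>
                                                                                              struct Sum {
                                                                                                  static constexpr std::size_t value = N + Sum<N - 1>::value;
                                                                                              };

                                                                                              template <>
                                                                                              struct Sum<0> {
                                                                                                  static constexpr std::size_t value = 0;
                                                                                              };

                                                                                              // Argument deduction: N is inferred from the array you pass in, and
                                                                                              // that in turn decides which Sum<N> chain gets instantiated.
                                                                                              template <typename T, std::size_t N>
                                                                                              constexpr std::size_t sum_upto(const std::array<T, N>&) {
                                                                                                  return Sum<N>::value;
                                                                                              }

                                                                                              int main() {
                                                                                                  static_assert(sum_upto(std::array<int, 5>{}) == 15, "5+4+3+2+1, at compile time");
                                                                                                  std::cout << sum_upto(std::array<int, 3>{}) << "\n";  // prints 6
                                                                                              }

                                                                                          In TH you would have to splice each expansion in explicitly; here the whole chain just falls out of calling the function.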

                                                                                        1. 1

                                                                                          Ah, it does sound inconvenient in TH to not have automatic instantiations.

                                                                                          1. 1

                                                                                              Yeah, it is; TH is much more bolted on in Haskell compared to templates in C++, but on the other hand, Haskell’s type system is vastly more powerful without metaprogramming, so you rarely really need it. As I said, hard to compare across languages :)

                                                                                    2. 2

                                                                                        In Lisp you have the full language at your disposal at compile time, so it’s way too easy.

                                                                                      1. 1

                                                                                          That was my first thought, that the actual game loop is still implemented at runtime (with a bash runtime), which is sort of cheating. On the other hand, since one of my research areas is modeling game mechanics in formal logic, it somehow feels natural to accept an implementation of a state->state' transition function as morally equivalent to an implementation of a game. :-)
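
                                                                                          Concretely, that is the kind of thing I mean; a rough sketch (my own toy, not the article’s code, and assuming C++14 or later): the rule of the game is an ordinary constexpr transition function, so the compiler can evaluate moves itself, and only the driver loop feeding it input has to exist at run time.

                                                                                              #include <iostream>

                                                                                              struct State {
                                                                                                  int player_x;
                                                                                                  int player_y;
                                                                                              };

                                                                                              enum class Input { Left, Right, Up, Down };

                                                                                              // The whole "game" is just this pure state -> state' transition function.
                                                                                              constexpr State step(State s, Input i) {
                                                                                                  switch (i) {
                                                                                                      case Input::Left:  --s.player_x; break;
                                                                                                      case Input::Right: ++s.player_x; break;
                                                                                                      case Input::Up:    --s.player_y; break;
                                                                                                      case Input::Down:  ++s.player_y; break;
                                                                                                  }
                                                                                                  return s;
                                                                                              }

                                                                                              // The compiler can "play" a fixed sequence of moves entirely at compile time.
                                                                                              constexpr State after_two_rights = step(step(State{0, 0}, Input::Right), Input::Right);
                                                                                              static_assert(after_two_rights.player_x == 2, "two rights should land on x == 2");

                                                                                              int main() {
                                                                                                  // At run time, the same function is fed live input by whatever driver loop exists.
                                                                                                  State s{0, 0};
                                                                                                  s = step(s, Input::Down);
                                                                                                  std::cout << s.player_x << "," << s.player_y << "\n";  // prints 0,1
                                                                                              }

                                                                                          Not a game anyone would want to play, but it is the same “the transition function is the game” stance, just with the compiler as the evaluator.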

                                                                                      1. 3

                                                                                        Nice! Really like the minimalist approach of it.

                                                                                        1. 3

                                                                                          Thanks!

                                                                                            I’m certainly aiming for a minimal front-end. As I write new features, I try to hold to a quote I once heard: “if it needs a manual to work, it’s not ready for production”. Obviously there are exceptions to this, but the spirit of the saying is that features should be as intuitive as possible. So I’m comfortable with the backend code getting sophisticated (and ideally not complicated) as long as writing the tests remains straightforward. I measure straightforwardness by how easy it is to explain a new feature using an example. If it’s difficult to explain using an example, it’s not ready.

                                                                                        1. 4

                                                                                          I’ve read in several places that the design was totally different from anything that had previously existed because Woz knew nothing about floppy design when he started :)

                                                                                          Looking forward to reading this!

                                                                                          1. 5

                                                                                              Woz stresses several times there that this was achieved by following what would normally be considered bad practice (using the CPU in a carefully hand-coded mix of logic and timing).