1. 8

    The fact that you apparently can’t compose functions seems a bit disappointing for something that aims to mimic SQL, but oh well. The only thing that seems really bad is that in the example given, it returned a nonsensical result instead of an error.

    $ fselect "MIN(YEAR(modified)) from Downloads"

    This looks really really nice though because it looks like the syntax may be a lot easier to use and remember than find’s minilanguage, plus it looks like they’ve done a really good job of making output formats be helpful and easy.

    One of the things that’s endlessly painful with scripts that start with a find invocation is feeding filenames into pipelines that then get messed up by funny characters in filenames.

    Stoked to try this out later.

    1. 1

      You could use something like this, which doesn’t require a 5MB Rust binary and probably some insane dependency chain (tested on OpenBSD):

      find . -print0 | xargs -0 stat -f "%m%t%Sm%t%N" | sort -n | sed 1q | cut -f 2-
      1. 4

        Yeah, that is precisely the sort of thing I hate writing. What does that snippet do if one of the filenames has a newline in the middle of it? Nothing good, I’d wager.
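        For what it’s worth, a NUL-safe version of that pipeline is possible with GNU tools (find’s -printf plus the -z flags on sort/head/cut; those are GNU extensions, so this sketch won’t run on the BSD userland used above):

        ```shell
        # Print the oldest regular file, keeping every record NUL-terminated
        # end to end, so a newline (or any other byte) in a filename can't
        # split a record. %T@ is the mtime in seconds since the epoch.
        find . -type f -printf '%T@\t%p\0' \
          | sort -zn \
          | head -zn1 \
          | cut -zf2-
        ```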

        1. 2

          What does that snippet do if one of the filenames has a newline in the middle of it

          Wait, how often do you encounter filenames with characters like newlines in them? Is foo\n.bar really a thing?

          1. 2

            Anything except / or \0 is legal on some filesystems. Just dealing with spaces in shell scripts is already a disaster.


              Well, sure. I didn’t ask if it was possible, I asked how likely it was you would see files with \n in the name… I’ve never come across this, which is why I ask.

    1. 3

      I appreciate the author taking the time to detail all of this, but I feel like the immense length of this article is a counterpoint to their previous article about how hosting your own email is “not hard”. I could easily see someone spending days stumbling through the process documented here, trying to figure out the right things to configure, etc. This article should help them with that, though it did take something like this article to make it ‘not hard’…

      1. 3

        I think something like NixOS should be a great way for people to make it easy. I need to try out this project:



          I’ve been using it for a year or more, works great and is a total pleasure to set up in NixOS.

          The one little downside is that adding a new mail user requires a NixOS system switch (like any NixOS configuration change) and that takes a few seconds—which is a nonissue for most small mail servers.

        2. 1

          This excellent article goes into a lot of detail and explains the reason for, and context of, each setting, besides describing things only tangentially related, from configuring OpenBSD’s httpd to acquire TLS certificates to screenshots of mail clients.

        1. 5

          Riding my bike around Oregon’s Crater Lake, as they have a car-free day!

          1. 2

            Damn, I wanted to do this but completely forgot. Have a good and safe ride!


              It was great! The weather was nice and probably the last warm day of the year. Should be good weather for the car-free day next Saturday too, but chillier.

          1. 13

            I was really confused about what was supposed to complete that sentence. After trying “John Carmack” and “admin@gmail.com” I finally figured it out by putting in “google” and getting an unrelated result:

            @font-face in CSS allows to include your own fonts inside an email.

            Of course, the really obvious guesses in this category of “bold”, “italics”, and “colors” don’t give any results either. I’m really questioning the primary presentation of this information. It seems to be much more suited to list format like on the features page.

            1. 2

              I still have no idea what this site is all about.

              1. 6

                It’s pretty much the same as [caniuse.com](https://caniuse.com/), a website for frontend developers that lists browser compatibility for various CSS/JS features. Except here it’s not compatibility with browsers but with email clients.

              2. 1

                I spent like 2-3 minutes on there before giving up… Still lost.

                1. 3

                  I spent like 2 minutes thinking about it and then finally guessing it might be about which HTML/CSS features are understood by email clients.

                  But only because I had to spend hours to do this, years ago.

                  It’s really, really badly communicated.

              1. 4

                This was very informative. I had no idea that everything on the wire was encoded as MPEG-2 frames, including data/internet.

                1. 1

                  Yes, that’s the opposite of what I always assumed! Is this because the original plan was to have much dumber edge devices that are just selectors and decoders of MPEG-2 frames?

                  It reminds me of CD-ROMs, where as I remember it the original CD audio format was essentially usable in hardware (that is, with no “computer”-level hardware), then later had a data layer defined on top of it.

                1. 7

                  So I think the technical details of the integer overflow are kinda beside the point (it’s not titled “let’s learn about implicit integer conversions today” for that reason), but for reference…

                  -268435455 is less than 256, but (size_t)(-268435455 * 64) is 64.
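                  A quick way to see the wrap concretely, sketched in shell (bash arithmetic is 64-bit, so masking to 32 bits mimics the int wraparound most platforms give in practice; in C itself, signed overflow is undefined behavior):

                  ```shell
                  # -268435455 is -(2^28 - 1), so multiplying by 64 gives -(2^34 - 64),
                  # and 2^34 is 0 (mod 2^32): a 32-bit wrap leaves exactly 64 behind.
                  echo $(( (-268435455 * 64) & 0xFFFFFFFF ))   # prints 64
                  ```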

                  1. 1

                    Maybe it’s not completely beside the point. Being able to recognize integer overflow situations would help protect against the ‘implicit backdoor’ situation in the article. You may not ‘fix’ the patch mentioned in the article with a situation that sets up overflow, for example.

                    But, that’s not the only way to ‘implicitly backdoor’ something.

                  1. 17

                    In my experience, a mail server is hard to set up but easy to maintain, and well worth the effort once completed.

                    1. 7

                      Completely agree. You get improved privacy and the ability to reject emails during the SMTP conversation. (What do you mean I’m subscribed to the Modbus newsletter? plonk) You also easily control your backup MXs for when your primary server goes down.

                      Trying to get support for your email in the cloud is not so easy either. I have a customer that uses Big Mailer Corp A for their email and found out that Big Mailer Corp B marks all their emails as spam. They’ve been trying for two years to fix that one.

                      1. 2

                        Would you mind sharing what your setup looks like (if you haven’t already)? I suspect you may get a far larger volume of mail (and possibly spam too?) than most folks around here, but mainly I’m just curious what you run on what.

                        1. 6

                          I run several mail servers and between them it’s kind of a mishmash. I’ve found postfix, postgrey, and opendkim to be among the most reliable pieces from the bin. For spam I just use postgrey and DNSBLs and a manually-updated blacklist of senders.

                      1. 5

                        I’ve gone back and forth with this idea recently (inspired by another article that was posted here a few weeks back), but it still takes time to maintain yet another system (OS updates, app updates, dealing with quirks from updates, having a backup plan [and testing it periodically], etc). I’ve spent the last few years in the middle ground: paying a small email provider for service, and it’s likely I’ll stay here for a while longer.

                        1. 10

                          I find update maintenance on a mail server to be rather minimal.

                          Once you have a server, you can do all sorts of other things with it. For the same mostly fixed cost.

                          I would estimate 99% of my admin time is spent on something other than smtpd.

                          1. 3

                            Thanks, that’s encouraging. I already run a number of systems for personal services, etc. But email would probably need to be some VPS somewhere that’s more stable than sitting in my garage and sipping residential internet service.

                            1. 1

                              Unlike with a web server, you have some delay before you lose incoming mail, typically a day or so, during which all sending mailers will keep retrying delivery regularly.

                              After that delay, mailers retry progressively less often.

                          2. 5

                            You’re probably referring to my article. Just so you know, my server’s been running perfectly ever since I set it up! I haven’t had to do any kind of maintenance in the past 3 weeks or so. The initial setup is a bit of a pain, I’ll give you that – a whole day’s work, for me anyway. But everything else is a breeze afterwards.

                            If you’re thinking about setting one up, I say go for it.

                            1. 2

                              I am! Thanks for the update. Yeah, that article has pushed me closer to giving it a go than anything I’ve read so far :)

                            2. 4

                              I run email, web, DNS and gopher on one (virtual) server. Like tedu, I have not really had to muck with mail. I don’t even have a backup MX record as I found it not worth the hassle. I have reverse DNS and SPF (I haven’t bothered with DKIM yet). I find that it pretty much just works.

                            1. 16

                              GitLab has had CI/CD for ages and it works great. I get that GitHub CI is also nice, but it feels overly hyped? Does it bring something that other providers lack?

                              1. 12

                                I don’t get how OP can write such a post and not mention GitLab. Reminds me of Apple’s habit of adopting old tech (e.g. NFC) and calling it “innovative”.

                                1. 1

                                  This reads a lot like a paid advertisement. Fail to consider alternatives that have existed for years? Check. Do not mention anything bad/negative? Check.

                                2. 5

                                  Yeah, it doesn’t look to me like there’s anything that GitLab hasn’t been doing for a while already. I guess I can understand the hype though; a few months ago I set up GitLab CI for one of the projects I’m working on at my current job, and to somebody who’s never done this before it looks cool and exciting.

                                  1. 2

                                    Something that surprised me was the ability to schedule workflows to run regularly – it eliminated a cronjob from a VPS and keeps the schedule with the code.

                                    1. 5

                                      You can do this in gitlab-ci too.

                                      1. 1

                                        Neat! I had no idea.

                                1. 10

                                  I don’t use Github and only use Gitlab as a mirror. In general it’s better to avoid features which get you stuck to the platform in a manner where you can’t easily move away later.

                                  1. 3

                                    Since they were acquired by Microsoft, GitHub has been doubling down on their “value-added” model. There should be a point where those additions get standardised to some extent though, because that lock-in might become a big issue in the future.

                                    1. 6

                                      I don’t think it’s in Microsoft’s best interest to ‘standardize’ with other CI services. They want to lock you in.

                                      1. 6

                                        There’s a book out there about how big change won’t occur until a disaster strikes. It might be “Lessons of Disaster” but I’m not sure if that was it. It was pretty convincing and gave good examples in history. Most importantly, the book showed how a lot of safety laws are implemented, not when people raise concerns, but after many people die from the lack of such laws. It takes a disaster to implement disaster preventions.

                                        I think that might happen to a lot of FOSS communities, where people talking about how it’s bad to get locked-in to a proprietor/vendor won’t be taken seriously (to the point of action) until disaster strikes. It probably won’t happen for a while and won’t be as dramatic, but I think there’s a good possibility that without standardization/decentralization, many will eventually be confronted with the pain that is vendor lock-in.

                                        I think Fossil has the right idea about including the issue tracker, wiki, etc. in the decentralized repos. I hope we see more solutions like that come up and see adoption.

                                        1. 2

                                          There should be a point where those additions get standardised to some extent though, because that lock-in might become a big issue in the future.

                                          For you, or for the org tasked with maximizing the number of mouths at the feeding trough?

                                        2. 1

                                          features which get you stuck

                                          Are you talking about GitHub actions or GitLab CI here?

                                          Because I don’t think that is much of a problem for GitLab CI. Since your jobs are purely script based, it’s quite easy to transition to different platforms. Yes, you can create stages, job dependencies and what not, but still.

                                        1. 1

                                          Someone flagged this as ‘broken link’, but it works for me ¯\_(ツ)_/¯

                                          1. 22

                                            I think people rely on JavaScript too much. With sourcehut I’m trying to set a good example, proving that it’s possible (and not that hard!) to build a useful and competitive web application without JavaScript and with minimal bloat. The average sr.ht page is less than 10 KiB with a cold cache. I’ve been writing a little about why this is important, and in the future I plan to start writing about how it’s done.

                                            In the long term, I hope to move more things out of the web entirely, and I hope that by the time I breathe my last, the web will be obsolete. But it’s going to take a lot of work to get there, and I don’t have the whole plan laid out yet. We’ll just have to see.

                                            I’ve been thinking about this a lot lately. I really don’t like the web from a technological perspective, both as a user and as a developer. It’s completely outgrown its intended use-case, and with that has brought a ton of compounding issues. The trouble is that the web is usually the lowest-common-denominator platform because it works on many different systems and devices.

                                            A good website (in the original sense of the word) is a really nice experience, right out of the box. It’s easy for the author to create (especially with a good static site generator), easy for nearly anyone to consume, doesn’t require a lot of resources, and can be made easily compatible with user-provided stylesheets and reader views. The back button works! Scrolling works!

                                            Where that breaks down is with web applications. Are server-rendered pages better than client-rendered pages? That’s a question that’s asked pretty frequently. You get a lot of nice functionality for free with server-side rendering, like a functioning back button. However, the web was intended to be a completely stateless protocol, and web apps (with things like session cookies) are kind of just a hack on top of that. The experience of using a good web app without JavaScript can be a bit of a pain with many different use cases (for example, upvoting on sites like this: you don’t want to force a page refresh, potentially losing the user’s place on the page). Security is difficult to get right when the server manages state.

                                             I’ll argue, if we’re trying to avoid the web, that client-side rendering (single-page apps) can be better. They’re more like native programs in that the client manages the state. The backend is simpler (and can be the backend for a mobile app without changing any code). The frontend is way more complex, but it functions similarly to a native app. I’ll concede a poorly-built SPA is usually a more painful experience than a poorly-built SSR app, but I think SPAs are the only way to bring the web even close to the standard set by real native programs.

                                            Of course, the JavaScript ecosystem can be a mess, and it’s often a breath of fresh air to use a site like Sourcehut instead of ten megs of JS. The jury’s still out as to which approach is better for all parties.

                                            1. 11

                                              (for example, upvoting on sites like this: you don’t want to force a page refresh, potentially losing the user’s place on the page)

                                              Some of the UI benefits of SPA are really nice tbh. Reddit for example will have a notification icon that doesn’t update unless you refresh the page, which can be annoying. It’s nice when websites can display the current state of things without having to refresh.

                                              I can’t find the video, but the desire for eliminating stale UI (like outdated notifications) in Facebook was one of the reasons React was created in the first place. There just doesn’t seem to be a way to do things like that with static, js-free pages.

                                              The backend is simpler (and can be the backend for a mobile app without changing any code).

                                              I never thought about that before, but to me that’s a really appealing point to having a full-featured frontend design. I’ve noticed some projects with the server-client model where the client-side was using Vue/React, and they were able to easily make an Android app by just porting the server.

                                              The jury’s still out as to which approach is better for all parties.

                                              I think as always it depends. In my mind there are some obvious choices for obvious usecases. Blogs work great as just static html files with some styling. Anything that really benefits from being dynamic (“reactive” I think is the term webdevs use) confers nice UI/UX benefits to the user with more client-side rendering.

                                              I think the average user probably doesn’t care about the stack and the “bloat”, so it’s probably the case that client-side rendering will remain popular anytime it improves the UI/UX, even if it may not be necessary (plus cargo-culting lol). One could take it to an extreme and say that you can have something like Facebook without any javascript, but would people enjoy that? I don’t think so.

                                              1. 17

                                                But you don’t need to have a SPA to have notifications without refresh. You just need a small dynamic part of the page, which will degrade gracefully when JavaScript is disabled.

                                                Claim: Most sites are mostly static content. For example, AirBNB or Grubhub. Those sites could be way faster than they are now if they were architected differently. Only when you check out do you need anything resembling an “app”. The browsing and searching is better done with a “document” model IMO.

                                                Ditto for YouTube… I think it used to be more a document model, but now it’s more like an app. And it’s gotten a lot slower, which I don’t think is a coincidence. Netflix is a more obvious example – it’s crazy slow.

                                                To address the OP: for Sourcehut/Github, I would say everything except the PR review system could use the document model. Navigating code and adding comments is arguably an app.

                                                On the other hand, there are things that are and should be apps: Google Maps, Docs, Sheets.

                                                edit: Yeah now that I check, YouTube does the infinite scroll thing, which is slow and annoying IMO (e.g. breaks bookmarking). Ditto for AirBNB.

                                                1. 3

                                                  I’m glad to see some interesting ideas in the comments about achieving the dynamism without the bloat. A bit of Cunningham’s law in effect ;). It’s probably not easy to get such suggestions elsewhere since all I hear about is the hype of all the fancy frontend frameworks and what they can achieve.

                                                  1. 8

                                                    Yeah SPA is a pretty new thing that seems to be taking up a lot of space in the conversation. Here’s another way to think about it.

                                                    There are three ways to manage state in a web app:

                                                    1. On the server only (what we did in the 90’s)
                                                    2. On the server and on the client (sometimes called “progressive enhancement”, jQuery)
                                                    3. On the client only (SPA, React, Elm)

                                                    As you point out, #1 isn’t viable anymore because users need more features, so we’re left with a choice between #2 and #3.

                                                    We used to do #2 for a long time, but #3 became popular in the last few years.

                                                     I get why! #2 is legitimately harder – you have to decide where to manage your state, and managing state in two places is asking for bugs. It was never clear if those apps should work offline, etc.

                                                    But somehow #3 doesn’t seem to have worked out in practice. Surprisingly, hitting the network can be faster than rendering in the browser, especially when there’s a tower of abstractions on top of the browser. Unfortunately I don’t have references at the moment (help appreciated from other readers :) )

                                                    I wonder if we can make a hybrid web framework for #2. I have seen a few efforts in that direction but they don’t seem to be popular.

                                                    edit: here are some links, not sure if they are the best references:



                                                    Oh yeah I think this is what I was thinking of. Especially on Mobile phones, SPA can be slower than hitting the network! The code to render a page is often bigger than the page itself! And it may or may not be amortized depending on the app’s usage pattern.





                                                    1. 3

                                                       A good example of #2 is Ur/Web. Pages are rendered server-side using templates that look very similar to JSX (but without the custom uppercase components part) and similarly desugar to simple function calls. Then at any point in the page you can add a dyn tag, which takes a function returning a fragment of HTML (using the same language as the server-side part, and in some cases even the same functions!) that will be run every time one of the “signals” it subscribes to is triggered. A signal could be triggered from inside an onclick handler, or even from an event happening on the server. This list of demos does a pretty good job at showing what you can do with it.

                                                      So most of the page is rendered on the server and will display even with JS off, and only the parts that need to be dynamic will be handled by JS, with almost no plumbing required to pass around the state: you just need to subscribe to a signal inside your dyn tag, and every time the value inside changes it will be re-rendered automatically.

                                                      1. 2

                                                        Thanks a lot for all the info, really helpful stuff.

                                                    2. 5

                                                      Reddit for example will have a notification icon that doesn’t update unless you refresh the page, which can be annoying. It’s nice when websites can display the current state of things without having to refresh.

                                                      On the other hand, it can be annoying when things update without a refresh, distracting you from what you were reading. Different strokes for different folks. Luckily it’s possible to fulfill both preferences, by degrading gracefully when JS is disabled.

                                                      I think the average user probably doesn’t care about the stack and the “bloat”, so it’s probably the case that client-side rendering will remain popular anytime it improves the UI/UX, even if it may not be necessary (plus cargo-culting lol).

                                                        The average user does care that browsing the web drains their battery, or that they have to upgrade their computer every few years in order to avoid lag on common websites. I agree that we will continue to see the expansion of heavy client-side rendering, even in cases where it does not benefit the user, because it benefits the companies that control the web.

                                                      1. 1

                                                        Some of the UI benefits of SPA are really nice tbh. Reddit for example will have a notification icon that doesn’t update unless you refresh the page, which can be annoying. It’s nice when websites can display the current state of things without having to refresh.

                                                        Is this old reddit or new reddit? The new one is sort of SPA and I recall it updating without refresh.

                                                        1. 3

                                                          Old reddit definitely has the issue I described, not sure about the newer design. If the new reddit doesn’t have that issue, that aligns with my experience of it being bloated and slow to load.

                                                      2. 12

                                                        example, upvoting on sites like this: you don’t want to force a page refresh, potentially losing the user’s place on the page

                                                        There are lots of ways to do this. Here’s two:

                                                        1. You can use an iframe for the upvote link, and have the state change just reload the frame.
                                                        2. If you don’t need feedback, you can also use a button with a target= to a hidden iframe.

                                                        Security is difficult to get right when the server manages state.

                                                        I would’ve thought the exact opposite. Can you explain?

                                                        1. 7

                                                          In the case where you have lots of buttons like that, isn’t loading multiple completely separate DOMs and then reloading one or more of them somewhat worse than just using a tiny bit of JS? I try to use as little as possible, but I think that kind of dynamic interaction is the use case JS was originally made for.

                                                          1. 7

                                                            Worse? Well, iframes are faster (marginally), but yes I’d probably use JavaScript too.

                                                            I think most NoScript users will download tarballs and run ./configure && make -j6 without checking anything, so I’m not sure why anyone wants to turn off JavaScript anyway, except for maybe because adblockers aren’t perfect.

                                                            That being said, I use NoScript…

                                                          2. 4

                                                            I’m not sure if this would work, but an interesting idea would be to use checkboxes that restyle when checked, and by loading a background image with a query or fragment part, the server is notified of which story is upvoted.

                                                            1. 2

                                                              That’d require using GET, which might make it harder to prevent accidental upvotes. Could possibly devise something though.

                                                          3. 4

                                                            One thing I really miss with SPA’s (when used as apps), aside from performance, is the slightly more consistent UI/UX/HI that you generally get with desktop apps. Most major OS vendors, and most oss desktop toolkits, at least have some level of uniformity of expectation. Things like: there is a general style for most buttons and menu styles, there are some common effects (fade, transparency), scrolling behavior is more uniform.

                                                            With SPAs… well, good luck! Not only is it often browser dependent, but matrixed with a myriad JS frameworks, conventions, and render/load performance on top of it. I guess the web is certainly exciting, if nothing else!

                                                            1. 3

                                                              I consider the “intended use-case” argument a bit weak, since for the last 20 years web developers, browser architects and our tech overlords have been working on making the web work for applications (and data collection), and to be honest it does so most of the time. They can easily blame the annoyances like pop-ups and cookie banners on regulations and people who use ad blockers, but from a non-technical perspective, it’s a functional system. Of course when you take a look underneath, it’s a mess, and we’re inclined to say that these aren’t real websites, when it’s the incompetence of our operating systems that created the need to off-load these applications to a higher level of abstraction – something had to do it – and the web was just flexible enough to take on that job.

                                                              1. 4

                                                                You’re implying it’s Unix’s fault that the web is a mess, but no other OS solved the problem either. Perhaps you would say that Plan 9 attempted to solve part of it, but that would only show that the web being what it is today isn’t solely down to a lack of OS features.

                                                                I’d argue that rather than being a mess due to the incompetence of the OS, it’s a mess due to the incremental adoption of different technologies for pragmatic reasons. Sadly, that seems to be how it goes: even if Plan 9 was a better Unix from a purely technological standpoint, Unix was already so widespread that it wasn’t worth putting in the effort to switch to something marginally better.

                                                                1. 7

                                                                  No, I don’t think Plan 9 would have fixed things. It’s still fundamentally focused on text processing, rather than hypertext and universal linkability between objects and systems – i.e. the fundamental abstractions of an OS rather than just its features. Looking at what the web developed into tells us what needs were unformulated and ultimately ignored by OS development initiatives, or rather set aside for their own in-group goals (Unix was a research OS, after all). It’s most improbable that anyone could have foreseen what developments would take place, and even more so that anyone will be able to fix them now.

                                                              2. 2

                                                                From the interviewer’s question I get the feeling that it’s easy for non-technical users to create a website using WordPress. Adding many plugins most likely leads to a lot of bloated JavaScript and CSS.

                                                                I would argue that it’s a good thing that non-technical users can easily create websites, but the tooling to create them isn’t ideal. For many users a WYSIWYG editor that generates a static HTML page would be fine, but such a tool does not seem to exist, or isn’t well known.

                                                                So I really see this as a tooling problem, which isn’t for users to solve but for developers, by creating an excellent WordPress alternative.

                                                                1. 2

                                                                  I am not affiliated with this in any way, but I know of https://forestry.io/, which looks like what you describe. I find their approach quite interesting.

                                                                2. 0

                                                                  for example, upvoting on sites like this: you don’t want to force a page refresh, potentially losing the user’s place on the page)

                                                                  If a user clicks a particular upvote button, you should know where on that page it is located, and can use a page anchor in your response to send them back to it.
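                                                                  A rough sketch of that pattern (the anchor scheme `comment-<id>` is illustrative, not this site’s actual markup): record the vote server-side, then redirect back to the referring page with a fragment identifier so the browser scrolls the voter back to the comment they clicked, no JavaScript required.

                                                                  ```python
                                                                  def upvote_redirect(referer, comment_id):
                                                                      """Build the redirect URL for a just-processed upvote.

                                                                      Drops any existing fragment from the Referer URL and appends
                                                                      an anchor pointing at the voted-on comment, so the browser
                                                                      restores the user's place on the page after the round trip.
                                                                      """
                                                                      base = referer.split("#", 1)[0]  # strip any old fragment
                                                                      return f"{base}#comment-{comment_id}"
                                                                  ```

                                                                  The server would respond with a 303 redirect to this URL after storing the vote.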

                                                                  1. 1

                                                                    It’s not perfectly seamless, sadly, and it’s possible to set up your reverse proxy incorrectly enough to break applications relying on various http headers to get exactly the right page back.

                                                                1. 42

                                                                  Microsoft ♥ Linux – we say that a lot, and we mean it!

                                                                  I’m calling bullshit on this. Microsoft ‘loves Linux’ so much that they’ve ignored requests to support Linux with Outlook/Word/Powerpoint/Teams/etc. Microsoft ‘loves Linux’ so much that they effectively killed Linux support on Skype. Microsoft ‘loves Linux’ so much that they prevent Skype from even working over the web interface on (arguably) the most popular browser used by folks on Linux (if you visit web.skype.com with Firefox you get redirected to this page: https://www.skype.com/en/unsupported-browser). Or do they only ‘love Linux’ when it suits their financial and PR interests?

                                                                  1. 19

                                                                    I’d like to add the lack of official linux drivers for their Microsoft-branded laptops to this list.

                                                                    1. 24

                                                                      do they only ‘love Linux’ when it suits their financial and PR interests?

                                                                      Well, obviously. Expecting any large corporation to “love” anything that’s not purely out of self-interest strikes me as rather naïve.

                                                                      Either way, I much prefer the current Microsoft over the “Linux is cancer” and “get the facts” Microsoft of 15 years ago.

                                                                      1. 10

                                                                        You can’t “love” something then actively ignore critical parts of it. A better slogan for what they are doing is “microsoft tolerates Linux.” I take issue with the fact that they are heavily implying that they are doing more than tolerating it now (when clearly they are not).

                                                                        1. 4

                                                                          Microsoft is making money off of Linux. They “love” it the only way a big profit-driven company can; they found a way to monetize people who actually like it.

                                                                          1. 4

                                                                            You can run Microsoft SQL Server on Linux, which seems like a lot more than “tolerating” it. Office has been ported to iOS and Android — I don’t see why they wouldn’t be porting it to Linux too, if there were sufficient demand. (The 2019 numbers I could find showed <5% market share for Linux, measured by web browser.)

                                                                            1. 8

                                                                              That still seems like toleration. I’m not convinced that they wouldn’t still consider Linux a cancer if it hadn’t stuck around and expanded beyond Microsoft’s wildest dreams. They may support Linux in a small subset of all the software they pump out, but they ignore it in the vast majority. Can we at least agree that the “Microsoft loves Linux” slogan is pure marketing bullshit and not reflective of their actual behavior?

                                                                        2. 8

                                                                          if you visit web.skype.com with Firefox you get redirected to this page: https://www.skype.com/en/unsupported-browser)

                                                                          Wow, you actually do. What the fuck Microsoft?

                                                                          1. 12

                                                                            Or do they only ‘love Linux’ when it suits their financial and PR interests?

                                                                            Like any company, yes. They love Linux on Azure.

                                                                            1. 4

                                                                              I recently had to battle and debug some EWS/Azure/Exchange crap just to get evolution-ews working with Microsoft 2FA. Microsoft has supported Exchange+Evolution exactly 0%. It’s all gnome devs and other random volunteers figuring out how their broken OAuth2/Azure/Office365 rubbish works.

                                                                              1. 3

                                                                                Microsoft ‘loves Linux’ so much that they effectively killed Linux support on Skype.

                                                                                The Skype client for Linux works fine. Sure, it’s Electron and ugly, but so is the Mac version. But it does the job.

                                                                                (Sure, there are better and open solutions, but the outside world uses Skype.)

                                                                                1. 0

                                                                                  Microsoft ‘loves Linux’ so much that they’ve ignored requests to support Linux with Outlook/Word/Powerpoint/Teams/etc.

                                                                                  You can’t use the O365 versions on browsers on Linux?

                                                                                1. 3

                                                                                  You can probably get more insight by grabbing some deb, e.g.

                                                                                  apt-get download coreutils

                                                                                  and extracting it

                                                                                  dpkg --extract coreutils_8.26-3_amd64.deb /tmp/coreutils
                                                                                  1. 1

                                                                                    hwj e-mailed 2 days ago

                                                                                    Wait, you can interact with lobste.rs over email?!

                                                                                    edit: wow, you can, with ‘mailing list mode’! That’s amazing!

                                                                                    1. 1

                                                                                      Yes, this is something HN has not ;) I also use mutt to filter for posts I’m interested in.

                                                                                  1. 20

                                                                                    Also something I found out about recently: the ability to quickly turn a curl command into a native executable via the --libcurl flag and a generated C file. See https://austingwalters.com/export-a-command-line-curl-command-to-an-executable for an example.

                                                                                    1. 1

                                                                                      This is neat, but why does it need: curl_easy_setopt(hnd, CURLOPT_SSH_KNOWNHOSTS, "/home/austin/.ssh/known_hosts"); ? That seems odd.

                                                                                      1. 1

                                                                                        If you build curl with libssh2, it works with SFTP!

                                                                                        Why the static home dir.. who knows.

                                                                                        1. 1

                                                                                          Looking at the source code, specifically src/tool_operate.c:1490, it seems like this is used for SCP/SFTP protocol support. As far as I can tell, curl reads known_hosts so it doesn’t bother you when accessing known hosts via those protocols. But don’t take my word for it, the source code is public and that’s amazing!

                                                                                          1. 1

                                                                                            Well, yes. The point that I’m making is that the URL is provided in the example, and can be trivially introspected to see that it’s an https:// not an scp / sftp URL. This would mean that setting SSH_KNOWNHOSTS is silly. Perhaps this is a “bug,” and it shouldn’t do this. Perhaps there’s some fun thing that requires this, even though I can see no legitimate reason for it… Free idea if you wanna get code into libcurl, I guess…

                                                                                      1. 6

                                                                                        I use an 8 year old Thinkpad X220 every day and the only upgrades are a solid state drive, linux and some extra RAM (which I need because some of my datasets are obnoxiously large and I keep too many tabs open).

                                                                                        It would be real nice to have a better screen, though.

                                                                                        1. 4

                                                                                          If you are comfortable with soldering, I highly recommend this mod to add support for a 1080p display. I did this mod about 9 months ago to my X230 and I’m so glad I did.

                                                                                          1. 1

                                                                                            I’ve let my X220 get so bashed up that I should probably buy a new one to do the mod in.

                                                                                            I suppose I could put the mod in this and then take it out again later if this laptop ever dies, but I’m kinda bad at soldering, so I’m not sure that’s likely ;)

                                                                                            1. 2

                                                                                              The last time I looked (~1 year ago) there were still a surprising number of X220 and X230 laptops for sale on ebay some as cheap as $100. I picked up an X230 to use for spare parts.

                                                                                              The mod was a little tricky to solder, especially getting the solder to sink down the through-holes to hit the pins sticking out of the motherboard. I ended up re-applying solder 3 times to finally get good connections there. The previous attempts seemed to work but then the display would cut out after using it for a few hours, or moving it, etc. Other than that, it has been solid ever since (I run Linux on it, not windows, so I cannot speak to the windows experience..)

                                                                                              1. 2

                                                                                                As if I use Windows ;)

                                                                                                Know about the ebay sales. Thanks for the insight on the soldering experience, though. Might come in handy.

                                                                                                1. 2

                                                                                                  Heh, I didn’t check your profile before replying. Sorry for insinuating you might use windows :P

                                                                                            2. 1

                                                                                              Thank you for posting this. A while back I had only seen the mod for the X220 and there was uncertainty about whether something similar would emerge for the X230. Finally I might be free of the only part of this laptop that I don’t care for…

                                                                                            3. 3

                                                                                              Thinkpad W530 here, great machine. Similar story: SSD, extra RAM, Linux. The Thinkpad driver situation on Linux has always been great.

                                                                                              1. 2

                                                                                                Adding an SSD makes a lot of sense, since the bottleneck for performance is rarely the CPU & usually the disk.

                                                                                              2. 3

                                                                                                Pretty much the same. 6 year old X230, no SSD, Linux, and extra RAM. The only thing I miss is a better screen. And I frequently use a much less powerful laptop than this one when I want to save some weight (an ASUS eeePC 1015).

                                                                                                1. 2

                                                                                                  My T420 has been great so far. They’re about $230-250 on eBay. Supports most OS’s.

                                                                                                1. 4

                                                                                                  Improve my personal security.

                                                                                                  I have changed all my passwords to diceware-derived passwords, printed my private key and put it into a ziplock bag and then into a small lockable case, and created another pub/priv key pair that is signed by my original one and buried underground (along with the original’s revocation certificate), in case the original is ever stolen.
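                                                                                                  For anyone unfamiliar with diceware, the core of the scheme is just uniform random selection from a large wordlist using a trustworthy entropy source. A minimal sketch (the wordlist here is a stand-in for the real 7776-word EFF/diceware lists, and purists roll physical dice instead of using an OS CSPRNG):

                                                                                                  ```python
                                                                                                  import secrets

                                                                                                  def diceware_passphrase(wordlist, n_words=6):
                                                                                                      """Pick n_words uniformly at random with a CSPRNG.

                                                                                                      With the standard 7776-word list, six words give about
                                                                                                      77.5 bits of entropy (log2(7776^6)).
                                                                                                      """
                                                                                                      return " ".join(secrets.choice(wordlist) for _ in range(n_words))
                                                                                                  ```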

                                                                                                  All my passwords are stored in individual PGP-encrypted files via pass (which I was about to re-invent before someone told me about it); these are then backed up to remote sources.

                                                                                                  I want to buy another Trezor and create custom firmware so it’s a specialized pgp device (all signing happens on-device).

                                                                                                  I hope to continue to improve my personal pantry tracker. It’s a one-file system – database, UI, and backend logic all included. I’m hoping it’ll inspire some people to create similar services.

                                                                                                  1. 1

                                                                                                    I hope to continue to improve my personal pantry tracker. It’s a one-file system – database, UI, and backend logic all included. I’m hoping it’ll inspire some people to create similar services.

                                                                                                    This sounds interesting, can you elaborate on what it does and how?

                                                                                                    1. 1

                                                                                                      You will see more as the weeks come. :)

                                                                                                  1. 18

                                                                                                    Last weekend I went outside at night to capture photos of M31/Andromeda. After taking 102 photos and stacking them together (with a bunch of flats, bias and dark frames), I ended up with this: https://twitter.com/YorickPeterse/status/1165614942218805248.

                                                                                                    This week there are two things I will be working on:

                                                                                                    1. For Inko I really need to get some work done on porting the parser to Inko itself. I don’t really like writing parsers, so I have been slacking off a bit.
                                                                                                    2. Figuring out what exactly I would need to do to obtain higher quality images of Andromeda. I probably won’t go out again this weekend as I have other activities planned, but I want to at least be prepared for the next time.
                                                                                                    1. 1

                                                                                                      Wow! That’s frickin’ awesome! Love to hear more about this as you go along.

                                                                                                      1. 2

                                                                                                        In this case I think it took a total of three hours, including setting things up. I was initially going for a total exposure time of around 1 hour. Sadly, for some reason my camera refuses to take more than 10-or-so shots when using interval shooting, even when telling it to shoot 100 photos. This meant a lot of back and forth between my chair and the camera, accidentally messing up the focus in the process, etc.

                                                                                                        The post processing took around three hours, most of which was spent reading articles about how to do X, Y, Z in GIMP, as most tutorials assume you are using Photoshop or other software not (properly) available on Linux. The process of stacking photos is largely automated using this set of scripts, followed by some manual tweaking of colors, sharpness, etc.
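                                                                                                        (For anyone curious why stacking is worth the hassle: averaging N aligned exposures reduces random noise by roughly a factor of sqrt(N), which is how 102 short frames can beat one long exposure. A toy sketch of just the averaging step, assuming the frames are already registered/aligned and calibrated:)

                                                                                                        ```python
                                                                                                        def stack_frames(frames):
                                                                                                            """Average equally-sized grayscale frames (nested lists of
                                                                                                            pixel values) to suppress random noise. Real stacking
                                                                                                            software also aligns the frames and applies dark/flat/bias
                                                                                                            calibration before this step.
                                                                                                            """
                                                                                                            n = len(frames)
                                                                                                            height, width = len(frames[0]), len(frames[0][0])
                                                                                                            return [[sum(f[y][x] for f in frames) / n for x in range(width)]
                                                                                                                    for y in range(height)]
                                                                                                        ```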

                                                                                                        1. 2

                                                                                                          The last time I made an attempt, I bought an app that allowed me to take shots from my tablet. That was a big time-saver.

                                                                                                          I haven’t done the stacking yet. So many cool things to do, so little time, you know?

                                                                                                      2. 1

                                                                                                        What’s your astrophotography setup look like? (I see the postprocess comment to Daniel below, but what kind of camera/scope/etc do you have?)

                                                                                                        I recently got a pretty nice Dob (big upgrade from my Walmart special) and have been thinking about trying some astrophotography (I’m a short drive from a lot of pretty dark areas to shoot from). I figure on starting with just a phone mount, if only because the space is so big and so full of rabbit trails to chase that I haven’t been able to get a foothold on what I actually need. :D

                                                                                                        1. 1

                                                                                                          I am using the following setup:

                                                                                                          • Scope: William Optics Zenithstar 61 + WO Field Flat61 field flattener
                                                                                                          • Mount: Star Adventurer Pro
                                                                                                          • Tripod: some 10-ish year old (but still decent) Vanguard tripod, without the pan/tilt head
                                                                                                          • Camera: Nikon D700, unmodified
                                                                                                          • Binoculars: Nikon A211 10x50, mostly used for plotting a course for my telescope as I don’t have a goto mount

                                                                                                          The total kit (including tripod and counterweight) weighs around 5 kg, so it’s quite portable. This is important as I do not own a car. The cost (excluding camera and tripod) was around €1200, which for telephotography is quite affordable.

                                                                                                        2. 1

                                                                                                          Nice work! My two telescopes are both Dobs (that I built a while back), and I experimented briefly with astrophotography by taking high-resolution videos of objects as they move across the FOV in the scope (since the Dob mounts are alt/az and don’t track). It’s OK for planets and other bright objects, but I have never tried capturing something as faint as Andromeda.. I may have to give that a try soon (assuming I remember how to do all that, it was a fairly involved process as you allude to..)

                                                                                                        1. 3

                                                                                                          I typically keep my laptop connected to the wall 24/7. My latest laptop is the only one where I’ve cared about battery life, because I’ve become sick of 25 minutes of battery when disconnected.

                                                                                                          The e403sa by Asus is the best laptop I’ve purchased. The price is incredible.

                                                                                                          1. 1

                                                                                                            So you basically have a desktop with terrible ergonomics & a tiny screen?

                                                                                                            1. 1

                                                                                                              They could use an external display + mouse/keyboard. It wasn’t a horrible idea in the past, when laptops were easily smaller than desktops, but now, with things like Intel’s NUC, you can get a desktop computer that is substantially smaller than any laptop.

                                                                                                              1. 1

                                                                                                                15 inch screen at 1080p is actually pretty nice, and I use an ergonomic wireless keyboard and mouse.

                                                                                                            1. 14

                                                                                                              I think some disclosure is in order. This guy, @jonathansampson, works for Brave. He has another account, @BraveSampson, which links to this one, but not the other way around. They used to have nearly-identical pictures and, IIRC, linked to each other, but not anymore.

                                                                                                              Am I the only one who finds it fishy for someone to post such reviews of their employer’s competitors whilst pretending to be an individual not on Brave’s payroll? Why should Mozilla proxy requests to Google through their own servers like Brave does? (And why is Brave (MITM?) proxying requests to Google in the first place?)

                                                                                                              Having multiple Twitter accounts is not against the rules if each account is for a separate purpose, but for someone working in the browser industry to have two separate accounts where they write about browsers on each one, all whilst hiding their affiliation and pretending to be an unaffiliated individual on one of them?! Seriously?

                                                                                                              Keep in mind that Brave and Chrome are the ultimate privacy violators, as it’s not possible to disable autoupdates on either one; Brave developers repeatedly disregarded the community’s complaints about this issue (ironically, going against https://brendaneich.com/2014/01/trust-but-verify/); so, you’re basically running a self-modifying binary, whether you like it or not. Any review anyone does is kinda meaningless, because there aren’t any versions per se, and it can do whatever the hell it wants the next day, without any public record of what it did yesterday. With Mozilla, there’s a public ftp directory with all the versions at ftp.mozilla.org — I haven’t seen anything like that for either Brave or Chrome.

                                                                                                              In fact, many folks used various official guides from Google to disable Chrome from autoupdating itself, e.g., because the newer versions broke font support or other system-level features, only to find such officially-sanctioned settings completely ignored down the line.

                                                                                                              How about doing a review of how much it costs in roaming fees to have Chrome/Brave download updates without your permission whilst you’re travelling?

                                                                                                              1. 5

                                                                                                                His ‘analysis’ of Brave on first start is a good example of how his interests are definitely in conflict with the message he is trying to project in his analysis of competing browsers: https://mobile.twitter.com/jonathansampson/status/1165391211999518720

                                                                                                                Everything is proxied through Brave (and that’s somehow a good thing?), including downloading of the Tor extension and HTTPS everywhere extensions. That seems like a terrible idea.

                                                                                                                1. 4

                                                                                                                  Wow, that is indeed even worse. As pointed out, we should use the proper term for “proxying” here — the Brave browser is performing a MITM attack on its own users, and somehow this “individual”, who hides his Brave affiliation, promotes this as the absolute best practice. Absolutely unbelievable!

                                                                                                                  1. 2

                                                                                                                    I don’t see why it matters. Your browser vendor is a part of your TCB by necessity; if Brave wants to send you malware, tampering with Google Safe Browsing seems like a very roundabout way of doing so, compared to just, you know, shipping a malicious update. Your browser vendor is also inevitably involved in things like push notifications and extension updates.

                                                                                                                    1. 2

                                                                                                                      Which is why it’s best to disable autoupdates, and download updates directly from something like ftp.mozilla.org — not an option with Brave.

                                                                                                                      1. 2

                                                                                                                        You’re still running arbitrary code provided by your browser vendor. The only difference is that you’re delaying your browser updates, which is kind of irresponsible.