1. 1

    Got this from an lwn.net comment; it’s about improving mutt:

    https://www.codeblueprint.co.uk/2016/12/19/a-kernel-devs-approach-to-improving.html

    1. 6

      So… this is privilege escalation on all Windows versions since XP and it is currently unpatched?

      I don’t know about you, but I run binaries from the internet every workday. I’m not talking about FOSS, either. “Web-based” screen-sharing/conferencing applications that require downloading and executing an .exe come to mind.

      Update: To be clear, some conferencing solutions require each user to download a unique .exe each time they join a conference, not just once to install something.

      1. 2

        Seems there is a patch already, see https://twitter.com/taviso/status/1161297483139407873

        1. 2

          I don’t know about you, but I run binaries from the internet every workday. I’m not talking about FOSS, either. “Web-based” screen-sharing/conferencing applications that require downloading and executing an .exe come to mind. Update: To be clear, some conferencing solutions require each user to download a unique .exe each time you join a conference, not just once to install something..

          That sounds like it can’t possibly be secure unless you either trust the people creating this software or you run them in throwaway-VMs. And I wouldn’t trust people creating software that asks you to run random EXEs all the time…

          1. 1

            It’s Cisco.

        1. 14

          There’s more than just a privacy issue with the way Google uses data. It only took me watching a single climate explainer video on YouTube to start getting straight-up climate denier videos in my recommendations. YouTube is contributing to polarisation and deception on a mass scale.

          Privacy aside, vendor lock-in aside, even monopolistic tendencies aside, the core issue is that Google (and other companies) are allowed to collect data and use it in damaging ways, at both the individual and the societal level. Data ownership and collection need to be handled in a completely different way.

          1. 4

            “single climate explainer video on YouTube to start getting straight-up climate denier videos in my recommendations.”

            It drives me nuts. Gotta be the dumbest algorithms in existence. I mostly get repeat videos as the suggestions. When clearing history, I get some different stuff. I plan on experimenting with my new VPN to see if different servers on an empty browser change up the content in interesting ways.

            One friend of mine, not a technical person, just searches for and clicks on random content every day to keep his feed diverse. It got me wondering about at least running queries and clicking on different categories of things I like. I mostly just stopped using YouTube for content discovery, though. I just look for specific things.

            1. 2

              Yes, I often see a video I watched literally a minute ago pop up in the recommendations. At this point, though, I have to think that it somehow drives more views – the algorithm can’t be that bad. It’s not at all what I want, but that’s irrelevant of course.

              1. 2

                See Guillaume Chaslot, ex-YouTube recommendations engineer, on this: https://twitter.com/gchaslot/status/1094359564559044610

                Also Tristan Harris spoke about the YouTube recommendations this week: https://www.vox.com/recode/2019/5/6/18530860/tristan-harris-human-downgrading-time-well-spent-kara-swisher-recode-decode-podcast-interview

            1. 1

              For LastPass it seems this analysis is for the desktop app only? You can use the browser extension without it, so I wonder if they share the same issues…

              1. 2

                It’s all been downhill there since I left.

                1. 2

                  Does that mean you’ve chosen the Windows laptop instead? ;-)

                  1. 1

                    Ha! I probably will switch at some point. I’m trying to put that off as long as I can.

                  2. 0

                    Hear, hear!

                  1. 32

                    It took until I reached the end of the article for me to realize that the title wasn’t sarcastic. It’s not the first time I’ve seen it but it’s a reminder that this feature is simultaneously fantastic and horrifying.

                    1. 5

                      I didn’t even realize until I read your comment. I skimmed it, saw the screenshot of Takeout, and was like “ah yes, another guide for moving off of Google services…” and closed the tab.

                      1. 2

                        Not sure why you’re calling this horrifying. Is it because Google has access to all your data? By the way, I’m the author of this blog.

                        1. 5

                          Is it because Google has access to all your data

                          Yes but in particular the location history is an extremely detailed log of your activity. You could imagine a dishonest corporation or government misusing this data to suppress dissidents or manipulate individuals.

                          1. 4

                            Mobile phone companies have had that data for a decade longer than Google. Don’t think that if we somehow got Google to stop collecting it, the problem of tracking would be solved.

                            1. 2

                              Regarding the detailed log of your activity, have a look at this video from Forbrukerrådet Norge:

                              “Google manipulates users into constant tracking” https://www.youtube.com/watch?v=qIq17DeAc1M

                            2. 3

                              A more accurate title would have been “Thank you Google for sharing some of my data with me.” It’s horrifying because it gives a glimpse of how much power they hold over their users, and how easily this power can be abused.

                          1. 18

                            All of these are actually things that you can do on new hardware as well.

                            I emulate a number of different user experiences from my workstations (I have a laptop and a desktop, both of which are top end). I have a few different VMs that I use to emulate:

                            • old phones
                            • old operating systems
                            • computers with very limited resources
                            • whatever else we decide to test

                            I can fully empathize with users that have limited hardware, without being limited by the hardware that I use for myself. I’m able to not just empathize - I can actually spin up a machine with the exact same specifications and try to experience things exactly like the end user who is having an issue.

                            It’s not bad to have older hardware, and I think it’s incredibly important for servers to be appropriately specced, but I don’t think that there is actually much benefit to having an old workstation other than not spending money, which is a sufficiently significant reason on its own.

                            1. 44

                              The problem with this approach is that poor hardware is a costume you put on for an evening. For many people, poor hardware is their life. Unless you’re constantly enduring the limitations of poor hardware yourself, you’re going to be constantly making life worse for those who do.

                              There’s a huge difference between immediately feeling the pain of your poor code before you even commit it, and hearing about it from a ticket from your user.

                              1. 27

                                Unless you’re constantly enduring the limitations of poor hardware yourself, you’re going to be constantly making life worse for those who do.

                                To be frank, I could not disagree more.

                                If I hire someone to do a kitchen renovation, and they show up with a hammer, a screwdriver and a handsaw and that’s it, then I’m going to start wondering whether that person is the right choice for the job. It’s not that you need more than a hammer, screwdriver, and handsaw to do the renovation; it’s that there are a bunch of newer tools that make doing that work much easier.

                                poor hardware is a costume you put on for an evening

                                Not at all; in fact, I’d argue that I’m probably more rigorous about testing poor hardware than you are. You are limited to testing your particular configuration of poor hardware, and you probably write things to address the issues that you have, but those are not universal experiences. How do you test things when you have, say, a hard requirement for IE6 that is run with no network connectivity? I’d imagine that you probably just don’t, because you can’t actually test that.

                                Again, I’m not saying that it’s bad to have old hardware, or that new hardware is a necessity for being a good developer, and I certainly agree that appropriately speccing your servers is of utmost importance, but I don’t think that poor hardware is actually a bonus. It’s a way you have elected to live, and that’s fine, but it doesn’t mean that you’re automatically giving more consideration to people who themselves have bad hardware.

                                1. 10

                                  To be frank, I could not disagree more. If I hire someone to do a kitchen renovation, and they show up with a hammer, a screwdriver and a handsaw

                                  That analogy just doesn’t make sense. It ignores exactly the point you’re responding to: the use of old hardware so you can see how well your code runs on old hardware.

                                  How do you test things when you have, say, a hard requirement for IE6 that is run with no network connectivity?

                                  How is that a hardware requirement? Testing lack of network connectivity doesn’t require special hardware, and IE6 isn’t hardware.

                                  It’s a way you have elected to live, and that’s fine, but it doesn’t mean that you’re automatically giving more consideration to people who themselves have bad hardware.

                                  It certainly does. It means you’re automatically giving more consideration than the average developer to people who themselves have bad hardware. It doesn’t mean it’s impossible for other people to also test on bad hardware, but it means they’re doing just that: testing on bad hardware. They’re not living bad hardware. And the vast, vast majority of developers don’t test on bad hardware at all, so you’re automatically a few steps ahead of them.

                                  1. 5

                                    To be frank, I could not disagree more. If I hire someone to do a kitchen renovation, and they show up with a hammer, a screwdriver and a handsaw

                                    That analogy just doesn’t make sense. It ignores exactly the point you’re responding to: the use of old hardware so you can see how well your code runs on old hardware.

                                    I’m not sure it’s particularly helpful, but a more accurate kitchen-fitter analogy would be something like if you were looking to get your tiny cramped kitchen refitted, and find yourself looking round a kitchen showroom or brochure full of enormous, spacious show-kitchens. You wouldn’t question their ability to fit kitchen units, but you might wonder whether they will be able to come up with a plan that makes the best possible use of the limited space available.

                                    1. 1

                                      A good analogy, dancing around the concept of dogfooding.

                                  2. 5

                                    You have a great point. I’ll also add that your method of using better boxes allows you, if you choose, to run more analyses and verification of the code. That’s one of the reasons I have an i7 now. That’s why some shops have entire clusters of machines. I think I read Intel uses around a million cores to verify their CPUs. Mind-boggling.

                                    If you have a multicore machine, you can also run IDEs, VMs, code generators, and V&V in parallel. That lets you reduce the time to feedback on each work item. Then you can iterate faster. Maybe even stay in a mental state of flow for a long time.

                                    1. 4

                                      You have a great point.

                                      Thanks! I was honestly taken aback by the fact that almost all the comments I received were supportive of the idea that the quality of the tools defines the quality of the artisan, which I think is entirely wrong.

                                      Having access to local VMs, IDEs, generators, etc. running parallel to each other is incredibly convenient, and I really just don’t buy into the notion that having a good computer makes you unaware or uncaring that bad computers exist.

                                      1. 6

                                        It does cause that problem for a lot of people. It doesn’t for everyone. It’s more evident when I read the Hacker News thread on a person’s experience with dial-up on modern sites. The comments made it clear a lot of people didn’t realize (a) how many folks had bad connections (dialup or mobile) and (b) how much worse their practices made the experience for those users. Like you, some were aware and used techniques like rate limiting to simulate those connections. Most weren’t.

                                        So, it is a risk worth bringing up. That doesn’t mean it will apply to everyone though. Any artisan who cares about achieving a result in a certain context will do whatever it takes to do that.

                                        1. 3

                                          I really just don’t buy into the notion that having a good computer makes you unaware or uncaring that bad computers exist.

                                          It follows from the fact that if you don’t even notice that the code you’re writing runs poorly because your computer is quick, you’re unlikely to improve it until someone with slow hardware tells you it’s slow code.

                                      2. 1

                                        I’d argue that I’m probably more rigorous about testing poor hardware than you are

                                        Good for you, but this is not the usual behavior of at least 90% of developers around.

                                        1. 3

                                          this is not the usual behavior of at least 90% of developers around.

                                          That’s an interesting number because it dovetails nicely with Sturgeon’s Law (tl;dr: 90% of everything is crap).

                                          I don’t think that having a bad machine makes you think more about how people interact with your software. It makes you think more about how a particular subset of users interact with your software. Specifically, it limits you to thinking about how people like you interact with your software.

                                          This is a particular problem for a lot of people. I don’t want to harp on the author because I believe that they’re a great developer that delivers great software, but a specific example of the limitations that they introduce seems to be that they haven’t thought about how https://sr.ht looks for people who have large monitors; the site is pushed off to the left. I think this is partially because he does not or cannot use a large monitor with his setup, so it’s not something he tests for. I think this limitation is the result of the technology he is using and the philosophy of why he chooses to use technology in this way; he favours a particular set of users because those users have a similar situation to how he has chosen to use technology. This is not the only issue with sr.ht that I (or others) have identified, and I think that sr.ht is developed more for Drew and people like Drew than it is for everyone. Edit: Drew corrected me on this. I’m out to lunch here.

                                          It’s important to note that I think it is okay that sr.ht is aimed at people like Drew; I think sr.ht is fantastic, I think that Drew is a great developer, and I am very happy that sr.ht exists. I’ve told dozens of people about it, and hopefully some will convert to paid users. It’s just not a service that works with my workflow, because I have chosen a different approach to things. The concern for me is that Drew seems to be convinced that developers like me are worse developers because we don’t do things the way he does. I think that’s clear from what he has stated even within this thread, and I think it’s an unfortunate position to hold.

                                          1. 3

                                            Aside: for the record I have thought about those people https://todo.sr.ht/~sircmpwn/sr.ht/112

                                            I think you should stop reading so much into my comments. If I didn’t say it, it’s not what I meant, and I never said that you were a worse developer for disagreeing with me.

                                            1. 2

                                              Unless you’re constantly enduring the limitations of poor hardware yourself, you’re going to be constantly making life worse for those who do.

                                              There’s a huge difference between immediately feeling the pain of your poor code before you even commit it, and hearing about it from a ticket from your user.

                                              What you wrote is pretty straightforward, but I added the inference that bad developers make life worse for other people and good developers don’t. If that’s not what you meant, then my apologies. To me, it looks like it’s what you wrote.

                                              for the record I have thought about those people

                                              Thank you for considering those with wider screens - the last time I saw it brought up, it was dismissed. My sincere apologies for spreading that misinformation, and I’ve edited my parent comment accordingly.

                                              If I didn’t say it, it’s not what I meant, and I never said that you were a worse developer for disagreeing with me.

                                              I’m not trying to dump on you or put words in your mouth. I’ve made some inferences from what you’ve said, and it seems like they’re unfair, so I’d love to understand your position more, but I’ll understand if you don’t reply.

                                              My stance is that a good developer is not defined by his tools. We can create software whether we’re using an 11-year-old laptop, a souped-up current-generation MBP, or a whiteboard and notebook, because these tools don’t define our ability to create things. We all have tools at our disposal to try to generalize from our experience to the experiences of all the people that might use our software, but the important thing is that we are making that generalization and trying to accommodate the people who have different experiences from our own.

                                      3. 9

                                        Also, even if you simulate poor hardware and test your code on it, there is probably a temptation to say, “It’s a bit slow, but that’s to be expected on low end hardware”.

                                        As someone who uses ‘poor’ hardware every day, you know how well other applications performing tasks of similar complexity run: it’s less easy to justify poor performance if you’re used to, e.g., a web browser performing much better on the same hardware.

                                        1. 13

                                          even if you simulate poor hardware and test your code on it, there is probably a temptation to say, “It’s a bit slow, but that’s to be expected on low end hardware”.

                                          That’s not necessarily an unacceptable solution. It depends on the parameters of the project.

                                          I recently completed a project that had super fast sprints and spent no time on optimization. The project was 100% internally facing for the client company; every person had the same hardware, and it ran fine on the hardware. I don’t think I’m a worse coder because we took advantage of that, or because we didn’t spend the time to optimize for other setups. Based on the requirements, we delivered exactly what was needed, for exactly the use case that they had. On the opposite end of the spectrum, we are also currently working on a project that is confined to using Windows XP, IE6, and has very limited network connectivity. If I had an 11-year old laptop, I probably couldn’t adequately test for that scenario; I would have to procure other hardware to actually work on that, and working on it would be more painful than it already is.

                                          My whole point simply is that if you’re good at what you’re doing, the quality of the tool that you’re using is less important than the fact that you are good at what you’re doing.

                                          1. 5

                                            I don’t think I’m a worse coder because we took advantage of that, or because we didn’t spend the time to optimize for other setups.

                                            Nobody is suggesting that you are. Most people develop software for a very wide range of devices, especially for mobile software where the range of speeds in common use is vastly wider than the range of speeds of PCs in common use. This topic is clearly focusing on that, not on people developing software intended to be run on one particular piece of hardware that is known ahead of time.

                                            1. 2

                                              The author of the article did not merely suggest it, he stated it in no uncertain terms in this very comment chain:

                                              Unless you’re constantly enduring the limitations of poor hardware yourself, you’re going to be constantly making life worse for those who do.

                                              I think this unambiguously says that if you use good hardware, you make bad software.

                                              Also you said this:

                                              Most people develop software for a very wide range of devices, especially for mobile software… This topic is clearly focusing on that,

                                              The author also specifically says this:

                                              I’m talking about making end-user applications which run on a PC.

                                              So I think it’s possible that we’re all talking at cross purposes.

                                              My entire point is that the tool does not define the skill of the developer. One can build great software whether using a $5K MacBook Pro or a $150 Chromebook, and the challenges that are faced are different for each.

                                        2. 4

                                          If your code takes X times longer to compile just because you want to know how it feels and learn from it, then go ahead, but it’s definitely useless. A VM gives you exactly the same experience, plus the choice to take a break from it.

                                          1. 11

                                            Incremental compilation is a thing. I spend way more time thinking and editing than I spend compiling.

                                            And if your low-end setup is something you want to “take a break from”, you’ve missed the point.

                                            1. 3

                                              How is using the target system as your main one better than choosing the best one available (or fitting your development needs?)

                                              Are we all going to start making Android apps on Android or making games directly on a PS4?

                                              You work on whichever system makes the development most efficient (for you) and then test the resulting application on the target platform(s).

                                              Only my opinion, of course.

                                              1. 8

                                                Making Android apps is a miserable fate I wish on no one. And yeah it’d be neato to make games on a PS4. But these are both beside the point and I think you know it - I’m talking about making end-user applications which run on a PC.

                                                1. 1

                                                  Hehe, it would be fun for sure :P

                                                  Anyway, even games or projects using certain desktop frameworks can take some time to build properly.

                                            2. 0

                                              The idea that you are using a programming language that takes noticeably long to compile while lecturing someone else on their choice of tools is pretty laughable.

                                              1. -4

                                                I won’t even enter into a discussion with you. Have a great day.

                                                1. -6

                                                  What a pointless comment. Either respond to my comment or don’t. Making a comment to tell me you aren’t going to respond is just stupid.

                                          2. 10

                                            but I don’t think that there is actually much benefit to having an old workstation other than

                                            I personally would add environmental reasons as well (taking into account the newer hardware and what would happen to the old hardware)

                                            1. 5

                                              I kept using older hardware for three reasons:

                                              1. Assess the efficiency of my software.

                                              2. Make sure it will run fast on anyone’s machine, like Drew does.

                                              3. Help small-time sellers and reduce waste by buying good, used products.

                                              For model-checking and stuff, I did recently need a beefier machine than what I had before. Plus, I had to consider all the slow-downs that will add up as new vulnerabilities are found in CPUs. I also wanted support for many FOSS OSes. So, I bought a refurbished Thinkpad 420 with a Core i7. Been feeling really middle class with these boot and run times.

                                              1. 3

                                                You can do 1 & 2 on new machines. I would argue you can do it better than you can on old hardware. For most projects I do, I test on a variety of virtual machines that emulate old hardware, so I know how well it runs. Let’s say you have a use case where you’re doing a project for a company and half the company runs three different operating systems across four different hardware configurations; I can test all of those pretty reasonably without getting 12 different workstations.

                                                The third option is a great one, but I would argue that it’s a subset of the “saving money” option.

                                                1. 2

                                                  Reducing waste is not about saving money

                                              2. 1

                                                I didn’t think of that, but to be fair, I have a whole storage area in my house for old hardware, going all the way back to the Aptiva we got in 1995. Ecological concerns are certainly important!

                                              3. 9

                                                All of these are actually things that you can do on new hardware as well.

                                                You can’t buy a new thinkpad with a 4:3 or 16:10 aspect ratio as far as I can tell. It’s also getting difficult to find models where the battery is easily swappable. Plus optimizing for thinness means nearly all modern laptop keyboards have very shallow, uncomfortable key actuation.

                                                I keep saying “I’ll upgrade when they make a new laptop that’s actually better than the one I have in ways I actually care about” and they keep not making one.

                                                1. 2

                                                  You can’t buy a new thinkpad with a 4:3 or 16:10 aspect ratio as far as I can tell.

                                                  I think that you’re absolutely right with your underlying point - if you have a specific hardware configuration and not having that is a dealbreaker for you, then that’s also a good reason for old hardware. That said, I think it’s a tradeoff - I’ve happily traded some of my keyboard preferences for a more powerful machine. I disliked the new MacBook Pro keyboard quite a bit when I got it, but I have found that the tradeoffs are worth it and overall I enjoy the machine.

                                                  And admittedly, this is less of an issue if you have a desktop workstation - you can choose the monitor configuration or keyboard that you want.

                                                2. 5

                                                  I will add Drew’s strategy has one benefit over yours in that he’s forced to use low-end hardware non-stop. That can reduce folks’ tendency to cheat. I like your approach, too, though. :)

                                                  1. 5

                                                    Cheating this system definitely happens. Not to get super sidetracked, but this is actually part of the project management axes of restraint: cost, quality, time, scope. This level of testing is covered by the “quality” aspect; if a client values high quality, then we do more testing of these sorts of things. If quality is lower on the scale, we may omit rigorous testing.

                                                  2. 3

                                                    All of these are actually things that you can do on new hardware as well.

                                                    I don’t think the author ever implied that these were things you can only do on older hardware.

                                                    I’m able to not just empathize - I can actually spin up a machine with the exact same specifications and try to experience things exactly like the end user who is having an issue.

                                                    For some cases, yes, but VMs are not 100% accurate in terms of the underlying hardware they emulate/virtualize, so for some work (e.g. reproducing customer issues in a wayland compositor) they are not really useful.

                                                    1. 2

                                                      I don’t think the author ever implied that these were things you can only do on older hardware.

                                                      Here’s another quote from the author:

                                                      Unless you’re constantly enduring the limitations of poor hardware yourself, you’re going to be constantly making life worse for those who do.

                                                      I think they’re clearly saying that older hardware makes one a better coder. I think that is not a good way to think. To be clear, I’m certainly not saying that new hardware makes one a better coder. I think that the two things just aren’t related.

                                                      VMs are not 100% accurate in terms of the underlying hardware they emulate/virtualize, so for some work (e.g. reproducing customer issues in a wayland compositor) they are not really useful.

                                                      Absolutely correct - VMs aren’t a magical answer to every single problem. There are definitely cases where they’re not useful, and I’d go further and say there are even edge cases where they’re not only not-useful but actively misleading! They’re still very useful, and especially useful in testing for widely used devices in the sort of situations that are outlined in this article (low power laptops, old phones, multiple operating systems).

                                                      1. 4

                                                        I think they’re clearly saying that older hardware makes one a better coder.

                                                        That brings me to another risk: older hardware can force one to use development practices that are sub-optimal. As in, you can do stuff during development that doesn’t have to slow down the release versions. Using older hardware could, in theory, limit what a person does in development or have a negative effect on how the released app is implemented.

                                                        This is my theory about C where BCPL and C authors made a lot of decisions based solely on the limitations of their hardware that are biting people to this day. People with much better hardware were designing languages with safety, modules, metaprogramming, better concurrency, and so on. So, the hardware limited their productivity, correctness/safety/security, and maintainability.

                                                        1. 2

                                                          And C, until very very recently, was the only language you knew could run on everything.

                                                          1. 1

                                                            You mean was capable of in theory or had compiler support already?

                                                            1. 2

                                                              Had compiler support already

                                                              1. 1

                                                                Yeah, that is kind of funny. I think the support for many of those machines happened due to its strong ecosystem of talent and tools. It was already the preferred thing for bare-metal efficiency. Why not adapt it to the new cruddy devices? Talk about going back to its roots, though, for what appear on the surface to be the same reasons.

                                                                On the technical side, it’s amusing to note it was the only thing that could run on EDSAC, but better designs could run on today’s embedded systems (i.e. 32-bit ones). Missed opportunity. Well, a bunch of people and companies ported other languages. They’re just super-super-niche. Astrobe Oberon comes to mind.

                                                        2. 3

                                                          Unless you’re constantly enduring the limitations of poor hardware yourself, you’re going to be constantly making life worse for those who do.

                                                          I think they’re clearly saying that older hardware makes one a better coder. I think that is not a good way to think.

                                                          AFAICT he’s just saying that low end hardware forces your sympathy with the rest of the crowd that did not buy a high-end laptop this year.

                                                          VMs are fine too, but it’s like using earplugs to pretend you’re deaf.

                                                          1. 4

                                                            VMs are fine too, but it’s like using earplugs to pretend you’re deaf.

                                                            To be fair, muting your software to test how easy it is to use without sound is probably a better option than deafening yourself. Sometimes temporary software solutions are better than permanent ‘hardware’ choices.

                                                      2. 3

                                                        Which virtualization software do you use, and how does it emulate the speed of old computers? Every virt platform can limit the amount of RAM, but how about the speed of the disk, CPU and so forth? The VM doesn’t really know what host it’s running on (i.e. whether there is an HDD or SSD in the host), and I think the common design is to run as fast as possible.

                                                        Speed is multi-dimensional, because your disk, memory, CPU, and network (especially network) may have different slowdowns, and different workloads will obviously be slowed down by different amounts due to this.

                                                        1. 3

                                                          VirtualBox allows you to change the clock speed of the CPU (execution cap) and VMWare allows you to set resources as well. Those are the two that I use with any real frequency.

                                                          With respect to write speeds on the disk, I rarely consider that (I almost exclusively write web software), though we do consider write speeds for one project - specifically, we want to check something writing to an HDD and not an SSD. To do that, we actually use an external HDD.

                                                          Network is what we actually consider the most, and we write everything with that in mind. Luckily, network is kind of a firehose situation - if you reduce how much you send, you improve the perceived speed at which you receive it, so we work at always reducing the footprint of all traffic; reduce the number of requests, reduce the amount in requests, reduce, reduce, reduce.
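
                                                          That execution cap is also scriptable, which helps when you want a repeatable “slow machine” profile. A minimal sketch, assuming VirtualBox’s VBoxManage CLI is on the PATH and the VM (the name “old-laptop” here is just a placeholder) is powered off:

                                                          ```python
                                                          import subprocess

                                                          def throttle_vm(vm_name: str, cpus: int = 1, mem_mb: int = 1024, cap_pct: int = 40) -> None:
                                                              """Reconfigure a powered-off VirtualBox VM to approximate low-end hardware."""
                                                              subprocess.run(
                                                                  ["VBoxManage", "modifyvm", vm_name,
                                                                   "--cpus", str(cpus),                  # number of virtual CPUs
                                                                   "--memory", str(mem_mb),              # guest RAM in MB
                                                                   "--cpuexecutioncap", str(cap_pct)],   # cap guest CPU time at cap_pct percent
                                                                  check=True,
                                                              )

                                                          throttle_vm("old-laptop")  # placeholder VM name
                                                          ```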

                                                          1. 2

                                                            In terms of network, Chrome dev tools lets you simulate slow network speeds. I find that incredibly useful when writing publicly facing web apps; it’s very easy to forget how much network latency affects user experience when you’re always hitting a local server.
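
                                                            If you want the same throttling in automated tests rather than just the DevTools UI, the Chrome DevTools protocol exposes it programmatically. A rough sketch, assuming Selenium 4 and chromedriver are installed; the numbers are arbitrary:

                                                            ```python
                                                            from selenium import webdriver

                                                            driver = webdriver.Chrome()
                                                            driver.execute_cdp_cmd("Network.enable", {})
                                                            driver.execute_cdp_cmd("Network.emulateNetworkConditions", {
                                                                "offline": False,
                                                                "latency": 300,                   # extra round-trip latency in ms
                                                                "downloadThroughput": 50 * 1024,  # bytes/sec, roughly a weak mobile link
                                                                "uploadThroughput": 20 * 1024,
                                                            })
                                                            driver.get("https://example.com")     # page loads under the simulated connection
                                                            driver.quit()
                                                            ```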

                                                            1. 3

                                                              Does Chrome support adding latency now? Last time I used those developer tools (more than 1 year ago now) I think it only supported reducing bandwidth.

                                                              1. 3

                                                                Luckily I live in Canada and rarely have to simulate a poor connection.

                                                                Jokes aside, the Chrome dev tools are great for a lot of things; I’ll point out that Firefox dev tools also have a way of doing the same!

                                                                1. 2

                                                                  Good to know. I use FF as my main browser, but still use Chrome for webdev. Partly because we’re targeting Chrome (it’s an internal app) and partly because FF still seems to lock up the entire browser on occasion.

                                                              2. 2

                                                                Thanks for the details. That makes sense and I think it’s worlds better than what most people do.

                                                                I was thinking along the lines of estimating the speed of a Raspberry Pi on my Dell Intel Core i7 desktop. I found that although the clock speeds differ by a factor of 5 or so (3+ GHz vs 700 MHz), the Pi is more like 30-50x slower! (e.g. with a workload of compiling Python)

                                                                Most software companies aren’t targeting the Pi, but they are targeting old ARM phones. I’d be interested to hear solutions for that. I guess VirtualBox only does x86 so it doesn’t apply to that problem.

                                                                I think the Android emulator is based on QEMU? I wonder if it tries to simulate speed too?

                                                          1. 1

                                                            What is a good linux distro if you want a polished, mac-like UX (though not necessarily thematically the same UI) and something somewhat “mainstream-compatible” (debian, ubuntu, etc.) in terms of packages and such? Dual-screen (or even triple-screen) setups with regular screens of various resolutions should be supported and easy to set up.

                                                            I’ve tried a few distros over the years, but in the end I end up working on a mac and using a Windows computer for games. With the advent of Steam Proton and better linux games support in general, it’s getting closer to the point where I can unify the two.

                                                            1. 3
                                                              1. 1

                                                                That, or possibly Deepin.

                                                              2. 1

                                                              There are tons of options, but Xubuntu would be at the top of my list.

                                                              1. 4

                                                                If your problem is Google being greedy for data, the solution is fairly simple: get Google off your device. In other words, make sure your devices run - as far as possible - only code you explicitly allow them to run.

                                                                This can be done with Android. It cannot be done with iOS. In both cases you’ll have to contend with the fact that the ‘radio code’ - the blob of binary code which runs the whatever-G radio the device is equipped with - can be used for all sorts of nefarious things and is fairly certain to contain either loads of known bugs or intentionally introduced backdoors for the TLAs of the world. Apart from that radio code the device will run an operating system and user applications, both of which can be under your control when running an AOSP-derived Android distribution. The device does not need to run any Google-proprietary code to be able to run Android apps (apart from a few which insist on interfacing with Google Play Services).

                                                                You seem to trust Apple to ‘do the right thing’ but you do not have anything to base that trust on other than feel-good statements by the company and its disciples. I trust Apple just as much as I trust Google or any other commercial enterprise. With this I mean to say that I trust them to look out for their bottom line, as that is what makes them tick. Google currently has a different perspective from Apple on how to get that number as high as possible, but maximise the number they shall. As I don’t trust either of them, I do my best to stay away from them as much as I can: no Apple anything, no Google Chrome, no stock Android, no Google apps, no Google services, no Google Play. Still I have a fully functional phone running Android, it just happens to run free software wherever possible, minus that currently unavoidable radio blob that is…

                                                                1. 4

                                                                  trust Apple to ‘do the right thing’ but you do not have anything to base that trust on other than feel-good statements by the company and its disciples.

                                                                  We have a little bit more than that.

                                                                  If Apple got caught violating the trust of users, that bell would ring around the world.

                                                                  I don’t trust either of them.

                                                                  You trust Google. You haven’t read the source code of your phone; it’s like a 50GB download, last I checked. These build scripts download more code over the Internet. Nobody can audit that. You also trust the people who made your “custom” android-building toolchain. You trust them (among other things) to identify and remove anything naughty Google has done. Not to mention you trust the guys who made your phone and all the components within. I have no idea who you’d sue with a random Android phone (some distant Chinese company?), let alone with some “custom Android” installed on it.

                                                                  no stock Android

                                                                  Running a custom Android makes you a QA of one. It’s like running Gentoo. You get to learn from nobody’s mistakes but your own.

                                                                  1. 1

                                                                    I haven’t read all source code, only those parts of it needed to port Android to the three devices I ported it to. Other people have read other parts of it, all of them outside of Google. I’m not the only one using this particular custom Android distribution (which started out as Cyanogen but now is called Lineage, parts of which I remove as I don’t need them, more so in the Cyanogen-days when they started messing with their own ‘Cyanogen login’).

                                                                    That bit about running custom Android or Gentoo implying you have a ‘QA of one’ is just plain silly, as you will probably understand yourself. Both custom Android as well as Gentoo builds come from the same source - plus or minus a few tailored modifications - and are built using the same tool chain. The results are very similar if not identical (with reproducible builds), except for the modified bits that is. I won’t lose any sleep over the fact that my personal modifications have a ‘QA of one’, just like I don’t lose sleep over the fact that the house I built and live in has a ‘QA of one’, the bread I bake has a ‘QA of one’ or any other fruits of my labour are not certified by some random committee. I trust my own observations well enough, the thing works, it does what I want it to do, it is silent on the network unless I want it to send or receive data, it runs for more than a week on a single battery charge where stock distributions won’t last more than 2 days.

                                                                    1. 1

                                                                      I don’t lose sleep over the fact that the house I built and live in has a ‘QA of one’

                                                                      I live in civilisation though, and didn’t build my own house.

                                                                      I do programming.

                                                                      Some other guy builds houses.

                                                                      The guy that built my house built hundreds of houses, and he had to get trained and certified by a random committee that trained and certified hundreds and perhaps thousands of other guys, and so on.

                                                                      I think him making a mistake that harms me is unlikely, but my civilisation promises me recourse if he does.

                                                                      I like that. I don’t want to learn how to build houses, since it would certainly take time away from my programming.

                                                                      Other people have read other parts of it, all of them outside of Google.

                                                                      Given the preposterousness of the claim (reading 50GB of anything), I’m not sure I understand what you expect here. I don’t believe you?

                                                                      1. 2

                                                                        Please calm down and think about what you just said:

                                                                        Other people have read other parts of it, all of them outside of Google.

                                                                        Given the preposterousness of the claim (reading 50GB of anything), I’m not sure I understand what you expect here. I don’t believe you?

                                                                        Read again and you’ll see that I stated that other people have read other parts of it, not that other people read all of it. Of course others did read all of it, if only the ones who wrote it in the first place and those who did code reviews, but that is beside the point. Also beside the point is the fact that the amount of source used for an Android build is not even close to 50 GB; you might be confused by the size of the repo versus the size of the code used for a single build.

                                                                        But… the thing is that you on the one hand seem to blindly trust Apple - because that is what we are talking about here - without having the ability to so much as peek at the code, while casting aspersions on the idea of building a distribution for your own device ‘because you can not read all the code’. While I’m sure Apple is happy to have customers like you who trust them blindly, this does not mean it is the rational thing to do (when thinking about ‘trust’, it can be more rational economically, as building your own takes time and effort), certainly not more rational than building your own.

                                                                        I think the conclusion to draw here is that you prefer to put your trust in others and look to your civilisation for recourse when those others fail your trust, while I prefer to trust my own instinct and insight and as such like to get hands-on when building things - whether it be software or hardware (from electronics to houses). To each his own, I guess.

                                                                        1. 1

                                                                          I think the conclusion to draw here is that you prefer to put your trust in others and look to your civilisation for recourse when those others fail your trust, while I prefer to trust my own instinct and insight and as such

                                                                          Or, it is my own instinct and insight that leads me to a completely different conclusion: that civilisation has value. Seriously.

                                                                          the thing is that you on the one hand seem to blindly trust Apple - because that is what we are talking about here

                                                                          I trust one party who might fail me, who has a lot to lose, whereas you trust dozens of parties, any of which might fail you, and none of which has anything to lose.

                                                                  2. 2

                                                                    CopperheadOS was a great Android ROM for this. Since the lead developer left the company, I suppose plain AOSP is the next best bet? I’m also looking forward to the Librem5 phone.

                                                                    1. 3

                                                                      If you’re interested in CopperheadOS, you might like this presentation by Konstantin Ryabitsev[1]:

                                                                      Life Behind the Tinfoil: A Look at Qubes and Copperhead (youtube)

                                                                      [1] Director of IT Infrastructure Security at The Linux Foundation

                                                                  1. 3

                                                                    Um. Following this link I got redirected to some kind of spam website that was blocked by my browser.

                                                                    Edit: clicked through a bunch more times to try and reproduce, got something slightly different:

                                                                    1. 2

                                                                      I’ve seen this on compromised WordPress sites before. If it’s the same as what I investigated previously, they do something like push the spam/ad/etc. to 1% of traffic and that makes it difficult to inspect/discover.

                                                                      1. 1

                                                                        Does it say Comcast in there? Could that be targeted to that connection?

                                                                        1. 1

                                                                          That’s… worrying. It’s a bog-standard wordpress site. What happens if you go to https://zwischenzugs.com?

                                                                          1. 1

                                                                            I clicked through a dozen times and nothing happened. It definitely didn’t happen every time on the original link either.

                                                                            1. 11

                                                                              Looks like it’s a malicious ad coming in. Hard to say which ad network it came from, since the site is loading an obscene number of them…

                                                                        1. 4

                                                                          I really like Ansible and I’m totally going to see if I can use all or part of this tutorial.

                                                                          It bothers me a bit that updating OpenBSD 6.3 -> 6.4, per the official documentation, requires booting the installation media to perform the upgrade. In the world of cloud providers and VMs, I want to put together a guide to attempt to do this semi-in-place with a single reboot.

                                                                          I’m glad I read all the release notes before attempting anything though. The OpenSMTPD configuration grammar has changed entirely. I’m going to have to redo all my work in a VM to make sure it all still works.

                                                                          1. 4

                                                                            The big issue with “in place” upgrades in the OpenBSD world is that there is no guarantee the ABI between X.Y and X.Y+1 will be the same. This can cause all sorts of issues while doing in-place upgrades. For example, tar, once replaced by the updated binary, could segfault on every subsequent call. This would leave the system in an unknown state.

                                                                            I wrote an upgrade tool a while back (snap) that could be used to upgrade from release to release. The ABI issue was hit every couple of releases, so I removed the option to upgrade releases.

                                                                            I am not saying it’s impossible, just that you will basically have to back up everything prior to doing an install.

                                                                            1. 1

                                                                              Following -current, an in-place upgrade mostly just works, but I always keep a new bsd.rd ready in case the in-place upgrade fails; rebooting into bsd.rd and running the upgrade will fix it.

                                                                              But I have also switched to a script that downloads the sets, patches bsd.rd and reboots. Much less hassle and minimal downtime.

                                                                              1. 1

                                                                                This. I do the same - download bsd.rd, add an auto_upgrade.conf file to the image, then use that bsd.rd on all my systems to upgrade them. Just copy the patched bsd.rd over /bsd on the target, reboot, wait a few minutes, and the box is back up on the new release. I wrote my own script ages ago, but nowadays the upobsd port can take care of the patching bsd.rd bit.
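
                                                                                For anyone wanting to script that flow themselves, here is a rough sketch of the fetch/verify/swap steps in Python, shelling out to the usual base tools. The release number, mirror URL and signify key name are examples only, and embedding auto_upgrade.conf into the ramdisk is left to upobsd (or your own tooling), as described above.

                                                                                ```python
                                                                                #!/usr/bin/env python3
                                                                                # Sketch of the bsd.rd upgrade flow: fetch the new ramdisk kernel, verify it
                                                                                # against the signed checksums, drop it in place of /bsd and reboot into it.
                                                                                import subprocess

                                                                                MIRROR = "https://cdn.openbsd.org/pub/OpenBSD/6.4/amd64"   # example release/arch
                                                                                PUBKEY = "/etc/signify/openbsd-64-base.pub"                # matching release key

                                                                                def run(*cmd):
                                                                                    print("+", " ".join(cmd))
                                                                                    subprocess.run(cmd, check=True)

                                                                                # Fetch the ramdisk kernel and the signed checksum list.
                                                                                run("ftp", "-o", "bsd.rd", f"{MIRROR}/bsd.rd")
                                                                                run("ftp", "-o", "SHA256.sig", f"{MIRROR}/SHA256.sig")

                                                                                # Verify bsd.rd before trusting it.
                                                                                run("signify", "-C", "-p", PUBKEY, "-x", "SHA256.sig", "bsd.rd")

                                                                                # (This is where upobsd would bake auto_upgrade.conf into the image.)

                                                                                # Boot the ramdisk kernel next and let the automated upgrade run.
                                                                                run("cp", "bsd.rd", "/bsd")
                                                                                run("reboot")
                                                                                ```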

                                                                            2. 2

                                                                                I want to put together a guide to attempt to do this semi-in-place with a single reboot.

                                                                              There is such a guide in the official upgrade notes. The only reason it suggests two reboots is KARL.

                                                                              1. 1

                                                                                  Back in the day, I just had a script that downloaded and extracted the sets and installed a new bsd. Then I rebooted and ran another script that took care of etc changes and new users and cleaned up old files no longer needed (according to the release notes). The last step was just a pkg_add -U.

                                                                                I didn’t run into issues but I was aware of the risks and being on my own when it broke ;-)
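
                                                                                  Something like the sketch below captures that kind of post-reboot script. The staging directory and the choice to leave the etc/xetc sets to sysmerge are my assumptions rather than a description of the original script, and this is exactly the sort of in-place shuffle the ABI caveat above applies to, so keep backups.

                                                                                  ```python
                                                                                  #!/usr/bin/env python3
                                                                                  # Sketch of the "extract the sets by hand" approach, run as root after the
                                                                                  # machine has already been rebooted onto the new kernel. /usr/rel is assumed
                                                                                  # to hold the downloaded sets for the target release.
                                                                                  import glob
                                                                                  import os
                                                                                  import subprocess

                                                                                  def run(*cmd):
                                                                                      print("+", " ".join(cmd))
                                                                                      subprocess.run(cmd, check=True)

                                                                                  # Unpack every set except etc/xetc straight over the running system.
                                                                                  for tgz in sorted(glob.glob("/usr/rel/*.tgz")):
                                                                                      if os.path.basename(tgz).startswith(("etc", "xetc")):
                                                                                          continue          # /etc changes are merged separately below
                                                                                      run("tar", "-C", "/", "-xzphf", tgz)

                                                                                  run("sysmerge")           # merge the /etc changes interactively
                                                                                  run("pkg_add", "-u")      # then bring all packages up to the new release
                                                                                  ```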

                                                                              1. 12

                                                                                    I agree with Jono Bacon’s take [1] on it, and this sums it up for me:

                                                                                His post today is a clear example of him putting Linux as a project ahead of his own personal ego.

                                                                                Also the full code of conduct [2].

                                                                                [1] https://www.jonobacon.com/2018/09/16/linus-his-apology-and-why-we-should-support-him/

                                                                                [2] https://git.kernel.org/pub/scm/linux/kernel/git/stable/linux.git/tree/Documentation/process/code-of-conduct.rst?id=8a104f8b5867c682d994ffa7a74093c54469c11f

                                                                                1. 4

                                                                                  I found it a great read as well and his blog has more. Thank you for submitting this.

                                                                                  1. 1

                                                                                    I see there you’ve made a note that this was posted to Schneier’s blog, can you share that post link as well? Thank you.

                                                                                    1. 1

                                                                                        That’s where I posted all my early design sketches and essays. There were a bunch of people in software and hardware, like Clive Robinson, who gave great peer review and debates. We had a meme, “you heard it first on Schneier’s blog,” where news reports, CompSci papers, or new products would echo what we already discussed.

                                                                                        Replacing subverted and/or low-quality Intel chips was something we discussed repeatedly, way before Meltdown/Spectre, with people like Clive using MCU’s for guards. I kept telling people about the VAMP and Leon3 CPU’s, which should block many attacks. I ended up just posting an exhaustive list here. RobertT in that discussion is a mixed-signal hardware specialist who spends much of his time obfuscating or reverse engineering ASIC’s. My analog attack predictions were just rehashes of the kind of stuff he was seeing or doing on a daily basis. That man almost single-handedly made me stop believing computers could be trusted. Clive and I recommend pencil, paper, and old-school methods for high security these days, with high-assurance security as just risk reduction.

                                                                                      1. 2

                                                                                        Thank you. I wasn’t aware that this had taken place some time ago.

                                                                                        1. 1

                                                                                            The root problems were discovered around 1992. The mainstream security community just ignored it all, like it ignored everything the high-assurance security community did. I had a rant on that here whose main article is a comment with the links to that work. We knew about cache- and microarchitecture-based leaks in 1992. I’ve been recommending mitigation for a long time. Well, mitigation attempts haha. Mainstream security often ignores stuff done outside their own circles or standards. Politics. There’s plenty of work out there waiting to be used or improved on, though. I post a lot of it here since there are smart programmers here with an unusual quality & security focus.

                                                                                    1. 5

                                                                                      Reminded me of FOAF: http://www.foaf-project.org

                                                                                      1. 2

                                                                                        Now that takes me back…

                                                                                        1. 2

                                                                                          I remember being so excited about FOAF when I learned about it around 2005. Those were the heady days of blogs and RSS feeds and open APIs.

                                                                                          1. 2

                                                                                                There’s a little bit of tinfoil-hattery going on in that article, but I don’t think he’s totally wrong. The Internet has matured to the point now where most of the walled gardens are about as big as they’re going to get, so the only growth potential left is to destroy the community gardens. It’s not at all unlike Ford and GM’s deliberate nationwide dismantling of public transportation throughout the 20th century.

                                                                                          1. 3

                                                                                            In progress:

                                                                                            Recently finished:

                                                                                                • Building E-commerce Applications, a complete waste of money and basically just a lazy compilation of unedited blog posts. Booooo.

                                                                                                • Come and Take It: The Gun Printer’s Guide to Thinking Free, by Cody Wilson of Defense Distributed fame. I finished this probably a week before the current kerfuffle started. There’s a whoooole lot of self-congratulatory bullshit and bluster in this, as Wilson is first and foremost (in my opinion) an attention whore, but buried in there are a couple of good reflections on the role of toolmakers in the pursuit of independence.

                                                                                            • Come as You Are, a delightful book by Emily Nagoski that I heard about through OhJoySexToy (webcomic about sexual health and practices). It covers a lot of interesting academic information about sex, attraction, and romance, and can help in debugging certain failure modes of relationships or in preemptively being a better partner.

                                                                                            1. 3

                                                                                              buried in there are a couple of good reflections on the role of toolmakers in the pursuit of independence.

                                                                                              We cannot be free until we control the means of production? That sounds like a good reflection, all right :-)

                                                                                              (Note: this may sound like I’m trying to rile you. I’m not, I am genuinely amused to see Marx echoed in this unexpected context.)

                                                                                              1. 4

                                                                                                As the good Chairman once said, “Political power grows out of the barrel of the gun…”.

                                                                                                A lot of Marxists, communists, and libertarians I think would actually have a lot to talk to each other about if they weren’t so busy engaging in culture war these days.

                                                                                                1. 3

                                                                                                  It isn’t too surprising, since all three sprang from the same philosophical tradition.

                                                                                                  A funny aside: a friend of mine recently noted, with regard to economics, we’re all Marxists now.

                                                                                                  1. 3

                                                                                                    Yup! Certain groups don’t really like to think about it, but because Marx did the first serious systematic analysis of how economies worked on a global scale (and coined the word “capitalism”, although contrary to popular opinion he did not coin but merely redefined “communism”), all modern economics owes a debt to Marx at least as big as the one it owes to Von Neumann. Even those opposed to Marx’s conclusions are using methods he pioneered to fight them. (Or, to be more direct: “economics begins with Marx” / “Karl Marx invented capitalism”)

                                                                                                    1. 2

                                                                                                      You might like this recent podcast episode from BBC Thinking Allowed: Marx and Marxism: https://www.bbc.co.uk/programmes/b0b2kpm0

                                                                                              2. 3

                                                                                                    Come and Take It: The Gun Printer’s Guide to Thinking Free, by Cody Wilson of Defense Distributed fame. I finished this probably a week before the current kerfuffle started. There’s a whoooole lot of self-congratulatory bullshit and bluster in this, as Wilson is first and foremost (in my opinion) an attention whore, but buried in there are a couple of good reflections on the role of toolmakers in the pursuit of independence.

                                                                                                    This was on my reading list; but, after I did the ol’ Amazon “Look Inside,” I took it off because it looked like the signal/noise would be unacceptable. Please give a shout if it ends up being worthwhile. I watched a few of his pre-DD/early-DD lectures on philosophy, and the guy gave me stuff to chew on.

                                                                                                1. 2

                                                                                                      So, again, having finished it, I think the same points could be handled in a pamphlet instead of the drawn-out narrative Wilson attempts.

                                                                                                  1. 1

                                                                                                    Thanks for humouring my obviously lacking reading comprehension skills. 🤦🏾‍♂️

                                                                                                  2. 1

                                                                                                    Lectures on philosophy? Had no idea he was into that, mind sharing some links?

                                                                                                    1. 2

                                                                                                          Cody Wilson Philosophy, Part I is the first of a two-part series.

                                                                                                          Why I printed a gun is short and sweet, but doesn’t get too deep.

                                                                                                1. 2

                                                                                                  This is really a non-issue as far as I’m concerned.

                                                                                                  Browsers (either standalone or with plugins) let users turn off images, turn off Javascript, override or ignore stylesheets, block web fonts, block video/flash, and block advertisements and tracking. Users can opt-out of almost any part of the web if it bothers them.

                                                                                                  On top of that, nobody’s twisting anybody’s arm to visit “heavy” sites like CNN. If CNN loads too much crap, visit a lighter site. They probably won’t be as biased as CNN, either.

                                                                                                  Nobody pays attention to these rants because at the end of the day they’re just some random people stating their arbitrary opinions. Rewind 10 or 15 or 20 years and Flash was killing the web, or Javascript, or CSS, or the img tag, or table based layouts, or whatever.

                                                                                                  1. 10

                                                                                                    Rewind 10 or 15 or 20 years and Flash was killing the web, or Javascript, or CSS, or the img tag, or table based layouts, or whatever

                                                                                                    Flash and table based layouts really were and, to the extent that you still see them, are either hostile or opaque to people who require something like a screen reader to use a website. Abuse of javascript or images excludes people with low end hardware. Sure you can disable these things but it’s all too common that there is no functional fallback (apparently I can’t even vote or reply here without javascript being on).

                                                                                                    Are these things “killing the web” in the sense that the web is going to stop existing as a result? Of course not, but the fact that they don’t render the web totally unusable is not a valid defense of abuses of these practices.

                                                                                                    1. 3

                                                                                                      I wouldn’t call any of those things “abuses”, though.

                                                                                                        Maybe it all boils down to where the line is drawn between supported hardware and hardware too old to use on the modern web, and everybody will have different opinions. Should I still be able to browse the web on my old 100 MHz Pentium with 8 MB of RAM? I could in 1996…

                                                                                                      1. 12

                                                                                                          Should I still be able to browse the web on my old 100 MHz Pentium with 8 MB of RAM?

                                                                                                        To view similar information? Absolutely. If what I learn after viewing a web page hasn’t changed, then neither should the requirements to view it. If a 3D visualization helps me learn fluid dynamics, ok, bring it on, but if it’s page of Cicero quotes, let’s stick with the text, shall we?

                                                                                                        1. 5

                                                                                                          I wouldn’t call any of those things “abuses”, though.

                                                                                                            I think table-based layouts are really pretty uncontroversially an abuse. The spec explicitly forbids them.

                                                                                                            The rest are tradeoffs; they’re not wrong 100% of the time. If you wanted to make youtube in 2005, presumably you had to use flash, and people didn’t criticize that; it was the corporate website that required flash for no apparent reason that drew fire. The question that needs to be asked is if the cost is worth the benefit. The reason people like to call out news sites is that they haven’t really seen meaningfully new features in two decades (they’re still primarily textual content, presented with pretty similar style, maybe with images and hyperlinks, all things that 90s hardware could handle just fine), but somehow the basic experience requires 10? 20? 100 times the resources? What did we buy with all that bandwidth and CPU time? Nothing except user-hostile advertising, as far as I can tell.

                                                                                                          1. 2

                                                                                                            If you wanted to make youtube in 2005 presumably you had to use flash and people didn’t criticize that

                                                                                                            At the time (ok, 2007, same era) I had a browser extension that let people view YouTube without flash by swapping the flash embed for a direct video embed. Was faster and cleaner than the flash-based UI.

                                                                                                            1. 1
                                                                                                            2. 2

                                                                                                              I’d say text-as-images and text-as-Flash from the pre-webfont era are abuses too.

                                                                                                        2. 7

                                                                                                          On top of that, nobody’s twisting anybody’s arm to visit “heavy” sites like CNN. If CNN loads too much crap, visit a lighter site.

                                                                                                          Or just use http://lite.cnn.io

                                                                                                          1. 2

                                                                                                            nobody’s twisting anybody’s arm to visit “heavy” sites like CNN

                                                                                                            Exactly. It’s not a “web developers are making the web bloated” problem, it’s a “news organizations are desperate to make money and are convinced that personalized advertising and tons of statistics (Big Data!!) will help them” problem.

                                                                                                              Lobsters is light, and so are HN, MetaFilter, Reddit, GitHub, GitLab, personal sites/blogs, various wikis, forums, issue trackers, control panels… Most of the stuff I use is really not bloated.

                                                                                                            If you’re reading general world news all day… stop :)

                                                                                                          1. 14

                                                                                                            Microsoft lets you download a Windows 10 ISO for free now; I downloaded one yesterday to set up a test environment for something I’m working on. With WSL and articles like this, I thought maybe I could actually consider Windows as an alternative work environment (I’ve been 100% some sort of *nix for decades).

                                                                                                            Nope. Dear lord, the amount of crapware and shovelware. Why the hell does a fresh install of an operating system have Skype, Candy Crush, OneDrive, ads in the launcher, and an annoying voice assistant who just starts talking out of nowhere?

                                                                                                            1. 5

                                                                                                              I’ll give you ads in the launcher – that sucks a big one – but Skype and OneDrive don’t seem like crapware. Mac OS comes with Messages, FaceTime and iCloud; it just so happens that Apple’s implementations of messaging and syncing are better than Microsoft’s. Bundling a messaging program and a file syncing program seems helpful to me, and Skype is (on paper) better than what Apple bundles because you can download it for any platform. It’s a shame that Skype in particular is such an unpleasant application to use.

                                                                                                              1. 3

                                                                                                                It’s not even that they’re useful, it’s that they’re not optional. I’m bothered by the preinstalled stuff on Macs too, and the fact that you have to link your online accounts deeply into the OS.

                                                                                                                I basically am a “window manager and something to intelligently open files by type” kind of guy. Anything more than that I’m not gonna use, and thus it bothers me. I’m a minimalist.

                                                                                                                1. 2

                                                                                                                  I am too, and I uninstall all that stuff immediately; Windows makes it very easy to remove it. “Add or Remove Programs” lets you remove Skype and OneDrive with one click each.

                                                                                                              2. 2

                                                                                                                Free?? I guess you can download an ISO, but a license for Windows 10 Home edition is $99. The better editions are even more. WSL doesn’t work on Home either; I think you need Professional or a higher edition.

                                                                                                                1. 2

                                                                                                                  It works on Home.

                                                                                                                  1. 1

                                                                                                                    Yup. Works great on Home according to this, minus Docker, which needs Hyper-V support.

                                                                                                                    https://www.reddit.com/r/bashonubuntuonwindows/comments/7ehjyj/is_wsl_supported_on_windows_10_home/

                                                                                                                2. 1

                                                                                                                  I always forget about this until I have to rebuild Windows and then I have to go find my scripts to uncrap Windows 10. Now I don’t do anything that could break Windows because I know my scripts are out of date.

                                                                                                                  It’s better since I’ve removed all the garbage, but holy cats that experience is awful.
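
                                                                                                                  For reference, a lot of those de-crapping scripts boil down to removing the preinstalled Store apps. Here is a minimal sketch of that idea, driving PowerShell from Python; the package name patterns are examples, not the actual script.

                                                                                                                  ```python
                                                                                                                  # Remove a few preinstalled Store apps via PowerShell's Get-AppxPackage /
                                                                                                                  # Remove-AppxPackage. Run from an elevated prompt; patterns are examples.
                                                                                                                  import subprocess

                                                                                                                  UNWANTED = ["*king.com*", "*CandyCrush*", "*Skype*", "*Solitaire*"]

                                                                                                                  def powershell(command):
                                                                                                                      subprocess.run(["powershell", "-NoProfile", "-Command", command], check=True)

                                                                                                                  for pattern in UNWANTED:
                                                                                                                      # Removes the app for the current user; add -AllUsers if appropriate.
                                                                                                                      powershell(f"Get-AppxPackage {pattern} | Remove-AppxPackage")
                                                                                                                  ```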

                                                                                                                1. 2

                                                                                                                  From what I understood, this doesn’t apply to Apple’s FileVault. It’s mostly metadata leaking from previewing images on other encrypted drives, like VeraCrypt volumes.