1. 48
    1. 64

      Except that, as far as I can tell, Firefox isn’t produced by a malicious actor with a history of all sorts of shenanigans, including a blatantly illegal conspiracy with other tech companies to suppress tech wages.

      Sure, if your personal threat model includes nation states and police departments, it may be worthwhile switching to Chromium for that bit of extra hardening.

      But for the vast majority of people, Firefox is a better choice.

      1. 13

        I don’t think we can meaningfully say that there is a “better” choice; web browsers are in a depressing technical situation where every decision has significant downsides. Google is obviously nefarious, but they have an undeniable steering position. Mozilla is more interested in privacy, but depends on Google; nor can they decide to break the systems that are created to track and control their users, because most non-technical users perceive the lack of DRM as breakage (“Why won’t Netflix load?”). Apple and Microsoft are suspicious for other reasons. Everything else doesn’t have the manpower to keep up with Google and/or the security situation.

        When I’m cynical, I like to imagine that Google will lead us into a web “middle age” that might clean the web up. When I’m optimistic, I like to imagine that a web “renaissance” would manage to break off Google’s part in this redesign and result in a better web.

      2. 19

        Mozilla also has a history of doing shady things and deliberately designed a compromised sync system because it is more convenient for the user.

        Not to mention, a few years ago I clicked on a Google search result link and immediately had a malicious EXE running on my PC. At first I thought it was a popup, but no, it was a drive-by attack, with me doing nothing other than opening a website. My computer was owned; only a clean wipe and reinstallation helped.

        I’m still a Firefox fan for freedom reasons but unfortunately, the post has a point.

        1. 12

          a few years ago I clicked on a […] link and immediately had a malicious EXE

          I find this comment disingenuous, because every browser on every OS has had, or still has, issues with a similar blast radius. Prominent examples include hacking game consoles and other closed operating systems via the browser, all of which ship some version of the WebKit engine. Sure, those hacks were used to “open up” the system, but they could have been (and usually are) abused in exactly the same way you described here.

          Also, I’m personally frustrated by people holding Mozilla to a higher standard than Google, when it really should be the absolute opposite, given how much Google knows about each individual compared to Mozilla. Yes, it would be best if some of the linked issues could be resolved so that Mozilla can’t intercept your bookmark sync, but I gotta ask: is that really a service people should be worried about? Meanwhile, Google boasts left, right and center about how secure your data is with them, and we all know what that means. Priorities, people! The parent comment is absolutely right: Firefox is a better choice for the vast majority of people, because Mozilla as a company is much more concerned about all of our privacy than Google is. Google’s goal always was and always will be to turn you into data points and make a buck off that.

          1. 1

            your bookmark sync

            It’s not just bookmark sync. Firefox sync synchronizes:

            • Bookmarks
            • Browsing history
            • Open tabs
            • Logins and passwords
            • Addresses
            • Add-ons
            • Firefox options

            If you are using these features and your account is compromised, that’s a big deal. If we just look at information security, I trust Google more than Mozilla with keeping this data safe. Of course Google has access to the data and harvests it, but the likelihood that my Google data leaks to hackers is probably lower than the likelihood that my Firefox data leaks to hackers. If I have to choose between leaking my data to the government or to hackers, I’d still choose the government.

            1. 1

              If I have to choose between leaking my data to the government or to hackers, I’d still choose the government.

              That narrows down where you live, a lot.

              Secondly, I’d assume that any data leaked to hackers is also available to Governments. I mean, if I had spooks with black budgets, I’d be encouraging them to buy black market datasets on target populations.

              1. 1

                I’d assume that any data leaked to hackers is also available to Governments.

                Exactly. My point is that governments occasionally make an effort not to be malicious actors, whereas hackers who exploit systems usually don’t.

        2. 6

          I clicked on a Google search result link

          Yeah, FF is to blame for that, but I also lol’d at the fact that Google presented that crap to you as a search result.

          1. 3

            Which nicely sums up the qualitative difference between Firefox and Google. One has design issues and bugs; the other invades your privacy to sell the channel to serve up .EXEs to your children.

            Whose browser would you rather use?

        3. 3

          Mozilla also has a history of doing shady things and deliberately designed a compromised sync system because it is more convenient for the user.

          Sure, but I’d argue that’s a very different thing, qualitatively, from what Google has done and is doing.

          I’d sum it up as “a few shady things” versus “a business model founded upon privacy violation, a track record of illegal industry-wide collusion, and outright hostility towards open standards”.

          There is no perfect web browser vendor. But the perfect is the enemy of the good; Mozilla is a lot closer to perfect than Google, and deserves our support on that basis.

      3. 8

        These mitigations are not aimed at nation-state attackers; they are aimed at people buying ads that contain malicious data that can compromise your system. The lack of site isolation in Firefox means that, for example, someone who buys an ad on a random site that you happen to have open in one tab, while another tab is looking at your Internet banking page, can use Spectre attacks from JavaScript in the ad to extract all of the information (account numbers, addresses, last transaction) that is displayed in the other tab. This is typically all that’s needed for telephone banking to do a password reset if you phone the bank and say you’ve lost your credentials. These attacks are not possible in any other mainstream browser (and are prevented by WebKit2 for any obscure ones that use it, because Apple implemented the sandboxing at the WebKit layer, whereas Google hacked it into Chrome).

        1. 2

          Hmmmm. Perhaps I’m missing something, but I thought Spectre was well mitigated these days. Or is it that the next Spectre, whatever it is, is the concern here?

          1. 11

            There are no good Spectre mitigations. There’s speculative load hardening, but that comes with around a 50% performance drop, so no one uses it in production. There are mitigations on array accesses in JavaScript that are fairly fast (Chakra deployed these first, but I believe everyone else has caught up), but that’s just closing one exploit technique, not fixing the bug, and there are a bunch of confused-deputy operations you can do via DOM invocations to achieve the same thing. The Chrome team has basically given up and said that it is not possible to keep anything in a process secret from other parts of that process on current hardware, and so has pushed more process-based isolation.
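
            To make the array-access mitigation concrete, here’s a rough sketch in C of the classic Spectre v1 gadget and the index-masking trick (hypothetical code, not from any engine; JITs emit the equivalent on array loads):

            ```c
            #include <stdint.h>
            #include <stddef.h>

            uint8_t probe[256 * 512];  /* cache footprint in here leaks the secret */

            /* Classic Spectre v1 gadget: the branch predictor can execute the body
             * speculatively even when i >= len, performing an out-of-bounds read. */
            uint8_t victim(const uint8_t *arr, size_t len, size_t i) {
                if (i < len)
                    return probe[arr[i] * 512];  /* speculative OOB read + cache leak */
                return 0;
            }

            /* The index-masking mitigation: a data dependency clamps the index even
             * under misspeculation (assuming len is a power of two in this sketch).
             * This closes the technique without fixing the underlying bug. */
            uint8_t victim_masked(const uint8_t *arr, size_t len, size_t i) {
                if (i < len)
                    return probe[arr[i & (len - 1)] * 512];
                return 0;
            }
            ```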

    2. 29

      Disclaimer: This article covers various things that are NOT right up my alley, so I’ll comment only on some. I’m not going to use my Mozilla Security hat, because I mostly work on other things.

      I know some of the claims are outdated. E.g., the JIT was rewritten and the analysis by Chris Rohlf doesn’t apply anymore. It is true that win32 lockdown and site isolation aren’t fully ready yet, unless you use Firefox Nightly.

      1. 1

        It seems from some of the issues linked from the Fission meta bug that Firefox is implementing OOPIF (out-of-process iframes). Is that the case now?

        e.g. https://bugzilla.mozilla.org/show_bug.cgi?id=1698044

        1. 3

          Yes, enabling Fission means that different-site iframes are out-of-process.

          Type “Fission” in the search field in preferences in Nightly to find the checkbox to enable it.

          1. 2

            That is fantastic news! Thank you!

    3. 12

      Is this corroborated by vulnerability counts in the respective browsers? TFA links to reports that

      [Firefox] gets routinely exploited by Law Enforcement

      and

      If you are in any way at risk, you should be using Chrome, no matter how much Firefox has improved.

      Be _very_ wary of anyone who tells you that Firefox security is comparable to that of Chrome.

      I’m not inclined to doubt these per se, but they’re undefended, and I’m not familiar with the people who made those claims.

      1. 4

        Be very wary of anyone who tells you that Firefox security is comparable to that of Chrome.

        Are we talking default installs here? A lot of people use Firefox because one can (and always could) install addons like NoScript, uBlock, etc. I wonder how comparable security is then.

        (Yes, these addons have become available for Chrome as well, but I doubt they’re as integrated and pervasive, and one is still at the mercy of Google continuing to allow them.)

    4. 12

      This article has a myopic view of security, as if it is a feature that can be added on top of pre-existing code. With the exception of the “Automatic Variable Initialization” section and the Rust section (which is a counter-argument), it’s entirely about mitigations that can be added on top of code in the hope of making bugs harder or (if you are really lucky) impossible to exploit, instead of not having such bugs in the first place (or having fewer of them).

      The article might make a convincing argument that Chrome has better vulnerability mitigation. Does this translate to Chrome having better security? I have no clue - it is highly dependent on what the base vulnerability rate is. That’s a question that the article didn’t even begin to address.

      1. 2

        Web browsers contain millions of lines of code written in C/C++, including shared libraries. They consume untrusted input from random sources on the Internet. It takes a single memory-safety bug to give an attacker arbitrary code execution. Even if Firefox had half or a quarter of the defect rate of Chrome, these mitigations would still be of critical importance.

        1. 2

          The parent article is not arguing that they’re not important.

          Say Firefox has 40 security vulnerabilities per year and Chrome has 80, because they have a bigger team and add code at a faster rate than Mozilla does. (I’ve picked these arbitrarily because they make the math easy.) Now say that because Chrome’s vulnerability mitigations are better, 50% of its security bugs are unexploitable in practice (whether because they’re actually unexploitable or because it’s simply too difficult), whereas only 25% of Firefox’s are. That leaves 40 × 0.75 = 30 exploitable bugs for Firefox, and 80 × 0.5 = 40 for Chrome. In this case, Chrome is less secure than Firefox, even though it has better vulnerability mitigations.

          Neither I nor the parent comment is suggesting that this is true (I completely made up the above numbers), but it could be true, and the article does not discuss this. I’ve also ignored vulnerability severity, but the argument still works even if you don’t.

          1. 4

            There are a few things missing from this analysis:

            • How long do bugs remain in the codebase? As a user, I don’t care whether a vulnerability came from a new feature or whether it came from a 20-year-old feature that no one tried to exploit until now.
            • If there is a vulnerability in a particular component, then what is the scope?

            The second is the really crucial one and is the motivation behind compartmentalisation. The smaller the amount of damage that an attacker can do, the less it matters if you have a vulnerability. With the Chromium and WebKit2 models, there’s a per-site process, which runs with minimal OS privileges. The attacker is assumed to have arbitrary code execution in that process. If there’s a vulnerability in this component, it makes life easier for attackers but, by itself, it doesn’t grant the attacker the ability to read or modify any data except that associated with the site that provided the malicious data. Consider the following example:

            • I go to my Internet banking site in one tab and log in.
            • I go to some malicious web site in another tab.

            Assume that there’s a vulnerability that the malicious web site can exploit to gain arbitrary-code execution in the component that is exposed to them. These are large C/C++ codebases with a bunch of third-party C/C++ libraries and so there almost certainly is at least one vulnerability that they can exploit. With a Chromium- or WebKit2-based browser, they then need to find a second privilege-elevation vulnerability to do anything particularly harmful. This requires attacking code in another process over a much smaller attack surface than the renderer process (which parses images, videos, HTML, and runs JavaScript). In Firefox, in contrast, they can immediately access all of the data for your Internet banking site, exfiltrate the data, and (depending on the two-factor authentication requirements of your bank) transfer money elsewhere or trick you into transferring money to a different person when you go to make a payment.
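
            As a hypothetical sketch of what “minimal OS privileges” can mean in practice, here’s strict-mode seccomp on Linux (real browsers use the more flexible seccomp-bpf filters plus namespaces, but the idea is the same):

            ```c
            #include <linux/seccomp.h>
            #include <sys/prctl.h>
            #include <unistd.h>

            int main(void) {
                /* A real renderer would inherit pipes to the privileged browser
                 * process here, before locking itself down. */
                prctl(PR_SET_SECCOMP, SECCOMP_MODE_STRICT);

                /* From this point, the process may only read(), write(), _exit(),
                 * and sigreturn(). Even with arbitrary code execution, an attacker
                 * cannot open files or sockets; they must attack the (much smaller)
                 * broker interface instead. */
                write(1, "hello from the sandbox\n", 23);
                _exit(0);
            }
            ```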

            It doesn’t matter if Chromium or Firefox is adding code (and vulnerabilities) to this compartment at a faster rate, because the damage that buggy code can do is limited.

            That’s not to say that Chromium’s sandboxing is particularly well designed. It relies on secrets, even though all of the operating systems it targets have mechanisms for providing file descriptors / handles to other processes that could be used to implement its sandboxing abstractions in a way that would eliminate the root cause of a large number of sandbox escapes. I suspect this comes from thinking about the problem as an ACL problem rather than as a capability problem. If they’d started with Capsicum rather than chroot, they’d probably have ended up with something more secure.
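
            For comparison, a minimal Capsicum sketch on FreeBSD (the file name is a placeholder; the point is that authority is exactly the descriptors you hold, with no secrets involved):

            ```c
            #include <sys/capsicum.h>
            #include <fcntl.h>
            #include <stdio.h>
            #include <unistd.h>

            int main(void) {
                /* Acquire everything we need *before* entering capability mode. */
                int fd = open("page-data.bin", O_RDONLY);

                cap_rights_t rights;
                cap_rights_init(&rights, CAP_READ, CAP_SEEK);
                cap_rights_limit(fd, &rights);   /* fd is now read/seek only */

                cap_enter();                     /* global namespaces disappear */

                char buf[4096];
                read(fd, buf, sizeof buf);       /* fine: we hold CAP_READ */

                if (open("/etc/passwd", O_RDONLY) < 0)
                    perror("open");              /* fails with ECAPMODE */
                return 0;
            }
            ```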

    5. 22

      The fact that we even need to worry about sandboxing for looking at glorified text documents is embarrassing.

      1. 27

        Your PDF reader also ought to be sandboxed; malicious PDF documents have been used to hack people.

        Ideally, your ODT reader also ought to be sandboxed. There have been RCE bugs in LibreOffice where malicious documents could exploit people.

        Reading untrusted user input is hard. Hell, even just font parsing is fraught with issues; Windows’s in-kernel font parser was a frequent target of bad actors, so Microsoft sandboxed it.

        Sandboxing is almost always a good idea for any software which has to parse untrusted user input. This isn’t a symptom of “the web is too complex”; it’s a symptom of “the web is so widely used that it’s getting the security features which all software ought to have”.

        The web is also too complex, but even if it was just basic HTML and CSS, we would want browsers sandboxed.

        1. 2

          Maybe the parent comment should be rewritten as, “The fact that we even need to worry about sandboxing for using untrusted input is embarrassing.” A lot of these problems would be solved if memory-safe languages were more widely used. (Probably not all of them, but a lot.)

      2. 23

        We need to worry about sandboxing for any file format that requires parsing, if it comes from an untrusted source and the parser is not written in a type-safe language. In the past, there have been web browser vulnerabilities that were inherited from libpng and libjpeg and were exploitable even in early versions of Mosaic that extended HTML 1.0 with the <img> tag. These libraries were written with performance as their overriding concern: when the user opens an image, they want to see it as fast as possible, and on a 386 even an optimised JPEG decoder took a user-noticeable amount of time to decompress the image. They were then fed untrusted data, and it turned out that a lot of the performance came from assuming well-formed files and broke with other data.
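
        The failure mode was usually some variant of this (a hypothetical sketch, not actual libpng/libjpeg code):

        ```c
        #include <stdint.h>
        #include <string.h>

        /* A "fast" decoder that trusts the length field in the file. */
        struct chunk {
            uint32_t len;    /* attacker-controlled */
            uint8_t  data[];
        };

        void decode(const uint8_t *file, uint8_t *out, size_t out_size) {
            const struct chunk *c = (const struct chunk *)file;
            /* Well-formed files never exceed out_size, so the bounds check was
             * skipped for speed; a malicious len is a heap buffer overflow. */
            memcpy(out, c->data, c->len);
            /* The safe version costs one compare: if (c->len > out_size) return; */
            (void)out_size;
        }
        ```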

        The reference implementation of MPEG (which everyone shipped in the early ’90s) installed a SIGSEGV handler and detected invalid data by just dereferencing things and hoping that it would get a segfault for invalid data. This worked very well for catching random corruption but it was incredibly dangerous in the presence of an attacker maliciously crafting a file.
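
        For those who haven’t seen the pattern, it looked roughly like this (a hypothetical reconstruction, not the actual reference code):

        ```c
        #include <setjmp.h>
        #include <signal.h>
        #include <stdint.h>

        static sigjmp_buf bail;

        static void on_segv(int sig) {
            (void)sig;
            siglongjmp(bail, 1);        /* treat the fault as "bad input" */
        }

        int decode_frame(const uint8_t *data) {
            signal(SIGSEGV, on_segv);
            if (sigsetjmp(bail, 1))
                return -1;              /* invalid data "detected" */
            /* ...dereference whatever offsets the file claims, with no
             * validation: random corruption usually faults, but a crafted
             * file can point reads and writes anywhere... (decoding elided) */
            (void)data;
            return 0;
        }
        ```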

        1. 8

          when the user opens an image they want to see it as fast as possible and on a 386 even an optimised JPEG decoder took a user-noticeable amount of time to decompress the image.

          Flashback to when I was a student with a 386 with 2MB of RAM and no math co-processor. JPGs were painful until I found an application that used lookup-tables to speed things up.

        2. 6

          The reference implementation of MPEG (which everyone shipped in the early ’90s) installed a SIGSEGV handler and detected invalid data by just dereferencing things and hoping that it would get a segfault for invalid data. This worked very well for catching random corruption but it was incredibly dangerous in the presence of an attacker maliciously crafting a file.

          I think this tops the time you told me a C++ compiler iteratively read linker errors.

        3. 2

          We need to worry about sandboxing for any file format that requires parsing, if it comes from an untrusted source and the parser is not written in a type-safe language.

          But the vulnerabilities in the web are well beyond this pretty low-level issue.

          1. 4

            Bad form to reply twice, but I realise my first reply was just assertion. Google Project Zero categorised critical vulnerabilities in Chrome and found that around 70% of them were memory-safety bugs. This ‘pretty low-level issue’ is still the root cause of the majority of security vulnerabilities in shipping software.

          2. 3

            They often aren’t. Most of them are still memory safety violations.

      3. 21

        The web is an application platform now. It wasn’t planned that way, and it’s suboptimal, but the fact that it is one has to be addressed.

      4. 17

        Considering web browsers to be “glorified text documents” is reductive. We have long passed the point where this is the priority of the web, or the intention of most of its users. One cannot just ignore that a system that might have been designed for one thing 30 years ago now has to consider the implications of how it has changed over time.

      5. 5

        The fact that we even need to worry about sandboxing for looking at glorified text documents is embarrassing.

        Web browsers are now a more complete operating system than emacs.

      6. 4

        The fact that sandboxing is still a typical strategy, as opposed to basic capability discipline, is embarrassing. The fact that most programs are not capability-safe, let alone memory-safe, is embarrassing. The fact that most participants aren’t actively working to improve the state of the ecosystem, but instead produce single-purpose single-use code in exploitable environments, is embarrassing.

        To paraphrase cultural critic hbomberguy, these embarrassments are so serious that we refuse to look at them directly, because of the painful truths. So, instead, we reserve our scorn for cases like this one with Web browsers, where the embarrassment is externalized and scapegoats are identifiable. To be sarcastic: Is the problem in our systemic refusal to consider safety and security as fundamental pillars of software design? No, the problem is Google and Mozilla and their employees and practices and particulars! This is merely another way of lying to ourselves.

      7. 4

        It’s text documents from unknown/untrusted sources over a public network. If you knew exactly what the text document contained a priori you wouldn’t need to blindly fetch it over the internet, no?

        To me the issue is we’ve relaxed our default trust (behaviorally speaking) to include broader arbitrary code execution via the internet…but the need for trust was always there even if it’s “just text.”

    6. 9

      The problem is that web browsers are just as complex as (or arguably more complex than) operating systems themselves. As an example, compiling Chromium and its dependencies takes multiple times longer than compiling all of HardenedBSD.

      Securing an application that uses (abuses?) “remote code execution” as a feature is nearly impossible. You’re executing someone else’s code, and you don’t even know whose.

      Sandboxing isn’t a security silver bullet. Attackers are interested in way more than popping calc.exe. Flipping a single bit in memory, something sandboxing doesn’t take into account, can be all that an attacker is after. Maybe you’re connected to an HTML5-based IPMI console, and the attacker may want to take control of it while you’re grabbing coffee.

      Additionally, there are more types of sandboxes than what’s provided by Windows and Linux. On FreeBSD, there’s Capsicum, a capability framework commonly used for sandboxing. If we’re going to study sandboxes, we should probably include more than just a glance at the two covered by this article.

      Instead of generic hand-waving, it would be great to tangibly show how different types of vulnerabilities and attack vectors are (or aren’t) protected against in each browser. Provide proof-of-concept code to demonstrate the discussion at hand.

      1. 2

        A web browser is an operating system with a compiler, renderer, and a host of other components. How long does it take to compile HardenedBSD, LLVM, Mesa, and GTK?

        Well, sure; LLVM is way overkill. We could instead use pcc and TinyGL, just as we could use QuickJS or Duktape in place of V8. And indeed, I think there’s a real argument to be made that LLVM is overengineered and the world would be a better place if we instead used QBE or Firm. But the complexity of Chromium is not, I think, greater than the totality of the operating system that hosts it.

        1. 4

          It takes just around 1.5 hours to build HardenedBSD on my laptop. By contrast, it takes well over twenty-four hours to compile Chromium on the same system. Modern browsers have their own scheduling algorithms, memory-management systems, etc. They really are quite similar to an OS. See also: JavaScript rump kernels. You can emulate complete architectures/systems in the browser.

          1. 1

            I don’t disagree. My point is that the scope of the browser is larger than just the core kernel and utilities, so you have to compare it to the kernel and its utilities, along with the compiler, graphics library, widget toolkit, display server, etc…

            And what you will find if you do this is that though the browser is bloated, the environment it replaces is also bloated.

            1. 2

              Which I am, because HardenedBSD is a full OS, not just a kernel.

    7. 7

      So in essence, it’d be nice if we could somehow get the whole Chromium team to do the same work for Firefox, with Alphabet’s budget.

      It’s similar with Linux, AFAIK: Windows 10 has some serious sandboxing improvements in its kernel, while on Linux, judging by some of the discussions regarding the faulty patches, the majority of kernel memory bugs are still left to automated analysis tools.

    8. 5

      I see a lot of posts on Firefox vs Chrome (or in this case Chromium), and it always seems to be people lobbying for others to use Firefox for any number of moral or security reasons. The problem I see with a lot of this is that Firefox just isn’t as good a user experience as Chromium-based browsers. Maybe Mozilla has the best intentions as a company, but if their product is subjectively worse, there’s nothing you can really do.

      I’ve personally tried going back to Firefox multiple times and it doesn’t fulfill what I need, so I inevitably switch back to Vivaldi.

      1. 10

        This is really subjective. I tried using ungoogled-chromium but switched back to Firefox. I used Vivaldi for a while but switched back to Firefox as well. Before that I was using the Firefox fork Pale Moon, but I got concerned about the lack of updates (due to how small the team is).

        1. 2

          Sure it is, but almost 80% of the world is using a Chromium browser right now, and Firefox is stagnant at best, slowly losing ground. Firefox even benefits from having been around longer, having a ton of goodwill and some name recognition, and it still can’t gain market share.

          1. 8

            It also doesn’t get advertised every time you visit Google from another browser. It also isn’t installed by default on every Android phone.

            1. 8

              Firefox also isn’t installed by default by a bunch of PC vendors.

            2. 1

              Firefox already had its brand established for years before that happened. It’s also worth noting that Microsoft ships its own browser (which is now a Chromium variant, but wasn’t until recently) and doesn’t even use Google as the default search engine, so the vast majority of new users don’t start with a browser that goes directly to Google to even see that message.

              1. 2

                And yet they start with a browser, and why replace something that already works, discounting those pesky moral reasons as if they aren’t worth anything?

          2. 4

            Among technical users who understand browsers, sure, you might choose a browser on subjective grounds like the UX you prefer. (Disclaimer: I prefer the UX of Firefox, and happily use it just fine.)

            Most people do not know what a browser even is. They search for things on Google and install the “website opener” from Google (Chrome) because that’s what Google tells you to do at every opportunity if you are using any other browser.

            When some players have a soap box to scream about their option every minute and others do not, it will never matter how good the UX of Firefox is. There’s no way to compete with endless free marketing to people who largely don’t know the difference.

            1. 1

              If that were the case, people would switch back to Edge and Safari, because both Windows and macOS ask you to switch back, to try it out again, etc., every so often.

              The UX of Firefox is OK (though FWIW they keep ripping off the UI of Opera/Vivaldi, and have been doing so forever), but it functionally does not work in many cases where it should, or it behaves oddly. Also, from a pure developer perspective, their dev tools are inferior to what has come out of the Chromium project. They used to have the lead there with Firebug, too, but they got outpaced.

      2. 2

        Yeah, I switched to Firefox recently and my computer has been idling high ever since. Any remotely complicated site being left as the foreground tab seems to be the culprit.

    9. 2

      The incomplete process isolation and Spectre mitigation are really scary. One tab can read the data of other tabs, slowly, but still. That’s game over for many things, isn’t it?

      1. 2

        It’s the web; web security, from a cynic’s point of view, has always been broken. I mean, you are literally executing RANDOM code on your computer, with zero hope that it is safe to do so. Web browsers have thrown a bunch of stuff around trying to make sure that at least the code that’s run for a given website is only useful for that given website.

        Web security has been game over, technically, for basically ever. It hasn’t slowed adoption, or the feature creep that has brought us to the point where a web browser is basically an entire OS hiding as a friendly user application, with essentially zero privacy.

        We have things like sub-resource integrity for HTTP now, but nobody uses it, because practically every website in existence loads gobs and gobs of 3rd-party code it has zero hope of ever getting verified, because no 3rd party will ever knowingly shoot themselves in the foot by giving up the ability to update code whenever they feel like it…

        Until websites STOP willy-nilly allowing 3rd-party code to run on their pages (which they can control with CSP headers), there is little hope.
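
        For example, a policy like this (the CDN origin is an illustrative placeholder) refuses to execute any script that isn’t first-party or from the one origin you’ve vetted:

        ```
        Content-Security-Policy: default-src 'self'; script-src 'self' https://cdn.example.com; object-src 'none'
        ```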

        Of course, basically every popular website fails terribly at HTTP security headers (see https://securityheaders.com/ and type in your favorite website for proof).

        Of course, the optimist’s perspective is that it’s getting BETTER, and most of the time it’s generally possible to at least be sure the code you are randomly running was authorized/approved by the website you visited, provided the server hasn’t been hacked.

        But right now, we are still trying to get past low-hanging security fruit like XSS (cross-site scripting) protection, which is now technically possible to fix, but… well, lots of websites still mostly suck at it.

        1. 2

          Web security is hard and untrusted code execution is the hardest part.

          A lot of the things that you mention are true, but at least they are fixable if the website provider cares to do so. Not being able to protect their users’ secrets from other sites even when they do everything right is even worse.

          It means that if I create a minimal website carefully, any private data can still be stolen from another tab by known techniques if the user uses Firefox. There are probably 100,000s of programmers that would be able to exploit that, given the public resources on how to do it. That is scary to me.

          A lot of exploits at least aren’t public knowledge for months or years before being fixed.

          1. 1

            Sure, but that is not remotely limited to Firefox. Browser security around running untrusted code is improving, and FF might be a bit behind, but it’s not like Chromium is somehow immune.