1. 104
    1. 31

      yikes, such a blatant violation of the privacy policy — i guess i’m not recommending arc anymore

      1. 14

        It sounds like the kind of thing where an engineer took a shortcut to get a feature working while completely ignoring its privacy implications, but that just goes to show that their internal culture doesn’t place enough emphasis on privacy for (literally) The Browser Company. Yikes indeed.

        1. 22

          They also still don’t offer any bug bounties - the $2k for this bug was an exception, and the same bug (remote UXSS) in Chromium would’ve fetched ~15x the amount. This alone is the main reason I haven’t given Arc a chance in the past: a modern web browser is an extremely valuable target, and I would not want to daily-drive a browser that gives so little incentive for ethical security research.

          1. 13

            The CTO said on the orange site that they’re going to launch a bug bounty program, but doing it now is a reaction to the bad press.

            They also wrote a blog post that’s only accessible via direct link.

            None of this makes it a redemption arc for Arc.

            1. 2

              Reading the post and thinking about the response and everything, I feel like Arc is taking this seriously.

              This feels like the definition of scar tissue forming. Obviously the best thing would be to have the serious mindset from the outset, but given that the team isn’t that big… this isn’t shocking to me.

              1. 1

                I saw that, but they’ve “fixed” the blog / direct link thing now. There was a delay of a couple of hours before it showed up on the blog index page.
                I don’t know if it was due to a technicality of how they run their blog or if it was done intentionally (I like to give the benefit of the doubt).

              2. 1

                I’m not clear on why that’s an issue? In general, a company can provide bug bounties of size X for reasons that are not strongly correlated with their commitment towards privacy or security?

                1. 14

                  Because if I, as a security researcher, find a vulnerability and want to cash in, I can either report it through a bug bounty program or sell it on some underground forum.

                  If a bug is worth $30k to Google and $100k to some shady guy on a forum, I’ll give it to Google, because I’m still getting a nice sum of money, with no trouble with the law, no chance of getting scammed, the ability to speak about my research publicly, etc. If I have a similar bug in Arc and they value it at $2k, is that really worth taking when I could easily get $100k somewhere else?

                  And bugs will happen; critical vulnerabilities are found in all major browsers surprisingly often.

                  1. 4

                    (Note: my thinking below is entirely divorced from Arc, whom I know nothing about, and care not at all for.)

                    The kind of market dynamics you are referring to are the sort that help create monopolies: established organizations can pay bounties of the “right size”, while new organizations will naturally struggle due to limitations in resources.

                    I don’t understand enough about the motivations of “security researchers”, to the point where I feel compelled to put quotation marks around the term, but in this context an argument could be made that they are “extortionists” (in quotation marks to acknowledge that this is subjective, and that security researchers are human). For many (e.g. my past self), this would be further proof of the inherent sickness in markets, used to then justify throwing the baby out with the bathwater…

                    …but there’s another way to look at it: a project might rightly choose to focus on building a good initial product, and therefore get enough organic demand for it, so that they can consider expanding on their product (e.g. by providing bug bounties) in the future. Here, we hope that the overwhelming majority of the market can rescue the organization from extortion, and we do not judge the organization negatively for not providing bug bounties.

                    Or, an organization is run by cheapskates who are not interested in building a good product, but merely riding consumer hype with investor money (itself earned through less than delightful means), before fading away into the ether.

                    These are two possibilities amongst many, but I hope they highlight why it seems to me that the existence of a bug bounty program is not correlated with product quality or organizational intent?

                    1. 14

                      > The kind of market dynamics you are referring to are the sort that help create monopolies: established organizations can pay bounties of the “right size”, while new organizations will naturally struggle due to limitations in resources.

                      Only inasmuch as any regulation or safety procedure helps to create monopolies. Lifts are more expensive than ladders, but that doesn’t mean we shouldn’t have OSHA.

                      > a project might rightly choose to focus on building a good initial product, and therefore get enough organic demand for it, so that they can consider expanding on their product (e.g. by providing bug bounties) in the future.

                      Any project must include an allowance for safety. In a one-person startup, that means reading about security. In a three-person startup, that means hiring a security consultancy. For a company with 50 employees, a $550M valuation, $128M raised, and 12 open roles - including 4 staff engineers with salaries around $250k USD - I don’t think it’s unreasonable to expect them to spend some serious money on security.

                      Bug bounties are popular not because a lot of C-suites have been captured by extortionists, but because they’re a great way for a company to fund a particular kind of security work: penetration testing. Bounties are generally much cheaper than hiring actual pen testers - pen-testing engagements tend to start in the five figures - while still incentivizing folks who like poking around to report their findings directly to you. Sometimes running a bug bounty is a significant enough risk mitigation that it even affects insurance costs.

                      1. 10

                        I would agree with you if we were talking about software with a lesser impact - for example, a new website or app with a small user base. There, I think bounties at Arc’s level (or the lack thereof) would make sense considering their size.

                        However, a web browser is an extremely dangerous program to have a vulnerability in, as it could expose pretty much your entire digital life to someone - I would not take such a risk just because it helps destroy monopolies (and besides, Arc is based on Chromium; are you really defeating a monopoly by using it?).

                        Arc is also not open-source - I believe that if they’re not willing to pay out the bounties themselves they should allow the community to audit the code and submit fixes for it.

                        1. 4

                          These are solid points that help me learn; thank you. And I am in agreement with you now.

                          Taken together with the sibling commenter’s information on Arc’s financials…

                          Yikes!

                      2. 3

                        To be fair, the value of an Arc bug is likely well under 1/15th the value of a Chrome bug, given the orders of magnitude difference in install base.

                        1. 3

                          That’s fair, but I don’t think it’d scale linearly. If you’re a bad actor targeting someone and that someone happens to be using Arc, you’d probably still be happy to pay a nice sum for an exploit, even if significantly less than the Chrome equivalent.

                          1. 3

                            Especially given that many popular YouTubers and tech influencers - notably MKBHD and his team - are users of Arc.

              3. 19

                The release notes for the release containing that patch seem to be:

                > While we don’t have any flashy new features to show off this time around, rest assured, our teams are hard at work crafting an even more reliable Arc for you.

                > Listen to the latest edition of Imagining Arc on Overcast, Spotify, or Apple Podcasts.

                > Thanks for using Arc!

                1. 1

                  Isn’t the patch mainly server-side? (the Firebase rules governing the various access rights)

                  1. 2

                    There is also the client-side patch to stop leaking URLs to the server.

                    1. 3

                      AFAICT that change hasn’t actually landed on the client. The post mentions it should come in v1.61.1, but the current latest for Mac seems to be 1.58.

                    2. 2

                      Point taken, but what I was getting at is that neither their blog nor their release notes mentioned the security issue that was in their product.

                      That being said, they now have.

                  2. 2

                    That cat is cute

                    1. 5

                      It is Neko. Back in the days of Windows 95 there was a Japanese program you could download that would launch a window where Neko would chase your mouse. I remember my dad got it via email…

                    2. 2

                      I mean, I totally get that when you build a client-side app, you don’t really want to think about the backend at all.

                      Also won’t using Firebase at any sort of scale be debilitatingly expensive?

                      1. 5

                        I used Firebase Realtime Database in prod a little while ago for an SPA, with one or two very small server-side endpoints for signing API requests and the like. Getting Firebase’s ACLs right for the database took effort, but it could be made to work. The bills were high, as you note.

                        I don’t know if it has changed, but Firebase did have a limitation that gave me some heartburn. You can export your Firebase database to a big JSON blob, and there’s good tooling around automating backups to Google Cloud Storage. What wasn’t well documented is that when you restore, the max upload size was 200 MB. If we ever had to restore prod, we would have had to delete the ACLs, write a script to break the backup into 200 MB chunks, upload them in series, then restore the ACLs (rough sketch below). We drafted a script to do this but thankfully never needed to use it in prod. I networked my way to Google Cloud engineers to ask whether staff had any internal tooling for restoring a larger database, but only got negative responses.

                        I liked Firebase, and I’d be happy to use it for auth or for syncing small state like notifications where 50-500ms latency is fine. I’d also try to sharply restrict it in the code and in other jobs to make sure it didn’t grow into a general-purpose data store.
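
                        For a rough idea of what that chunking script looked like, here is a minimal sketch - hypothetical database URL, env var, and file name, not our production code. It assumes Node 18+ for the global fetch, assumes no single top-level subtree exceeds the limit, and skips the delete-ACLs / restore-ACLs steps around it:

                        ```typescript
                        // Sketch: split a Realtime Database JSON export into chunks under the
                        // ~200 MB ceiling and write them back in series via the RTDB REST API.
                        import { readFileSync } from "fs";

                        const MAX_CHUNK_BYTES = 200 * 1024 * 1024;               // the undocumented restore limit
                        const DB_URL = "https://example-project.firebaseio.com";  // hypothetical database URL
                        const DB_SECRET = process.env.DB_SECRET ?? "";            // legacy auth token / secret

                        async function putChunk(chunk: Record<string, unknown>): Promise<void> {
                          // PATCH at the root writes every top-level key in the chunk in one request.
                          const res = await fetch(`${DB_URL}/.json?auth=${DB_SECRET}`, {
                            method: "PATCH",
                            headers: { "Content-Type": "application/json" },
                            body: JSON.stringify(chunk),
                          });
                          if (!res.ok) throw new Error(`chunk upload failed: ${res.status}`);
                        }

                        async function restoreInChunks(backupPath: string): Promise<void> {
                          // The export is one big JSON object keyed by top-level paths.
                          const backup: Record<string, unknown> = JSON.parse(readFileSync(backupPath, "utf8"));

                          let chunk: Record<string, unknown> = {};
                          let chunkBytes = 0;

                          for (const [key, value] of Object.entries(backup)) {
                            const entryBytes = Buffer.byteLength(JSON.stringify({ [key]: value }));
                            if (chunkBytes > 0 && chunkBytes + entryBytes > MAX_CHUNK_BYTES) {
                              await putChunk(chunk); // upload in series, as described above
                              chunk = {};
                              chunkBytes = 0;
                            }
                            chunk[key] = value;
                            chunkBytes += entryBytes;
                          }
                          if (chunkBytes > 0) await putChunk(chunk);
                        }

                        restoreInChunks("./backup.json").catch((err) => {
                          console.error(err);
                          process.exit(1);
                        });
                        ```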

                        1. 2

                          Note that this is just Firestore, which can be used without the rest of Firebase.

                          1. 1

                            I mentioned that the ACL was difficult to get right, and there’s now a great response exploring those ACL issues. It’s not just keeping in mind the unobvious semantics of resource.data vs request.resource.data, but also the necessity of specifically prohibiting writes to an owner id field, or of including the owner id in the path. It meant a lot of string munging to build/decompose paths and compare sections. Like this author, I expect that vulnerable ACLs are ubiquitous.
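
                            As a made-up illustration of that second pitfall (hypothetical collection name, field name, and config - not any real schema): if a rule only checks something like request.auth.uid == request.resource.data.ownerId, the write below goes through, because the attacker supplies their own uid as the new owner. The rule also needs to compare against the existing resource.data.ownerId, or the owner id needs to live in the document path.

                            ```typescript
                            // Sketch of the write a naive owner-field rule fails to block: "taking over"
                            // someone else's document by overwriting its owner field with your own uid.
                            import { initializeApp } from "firebase/app";
                            import { getAuth, signInAnonymously } from "firebase/auth";
                            import { getFirestore, doc, updateDoc } from "firebase/firestore";

                            // Placeholder config; any project with anonymous sign-in enabled would do.
                            const app = initializeApp({ apiKey: "web-api-key", projectId: "example-project" });
                            const db = getFirestore(app);

                            async function takeOverNote(victimNoteId: string): Promise<void> {
                              const { user } = await signInAnonymously(getAuth(app));
                              // Rewrites the ownership field on a document the attacker does not own.
                              await updateDoc(doc(db, "notes", victimNoteId), { ownerId: user.uid });
                            }

                            takeOverNote("some-note-id").catch(console.error);
                            ```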