1. 3

    As I understand, this is the web part only. Is there a FOSS cycling/running tracker on mobile that you can recommend?

    1. 2

      The website should work fine on mobile when it’s done. I wanted to use a web app for everything, but unfortunately it’s not possible to keep the device awake to record GPS from a web app, so I will be learning Android development and building a full app for it.

      Currently I use OSMAnd for recording and then upload the file using my desktop. I’m not aware of anything better that’s FOSS currently.

      1. 1

        Owntracks is quite good https://github.com/owntracks

      1. 1

        Nice post! As you said, the Xray vision is there to protect the extensions from evil websites, not the other way around. So to get access to the actual object on the other side, you don’t need to break the sandbox. Just ask nicely (and make sure you’re not becoming a confused deputy by trusting untrusted values).

        To get access to the actual object, just overwrite navigator.wrappedJSObject instead.

        See the how-to on sharing data with page scripts.
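
        A minimal, Firefox-only sketch of what that looks like from a content script (the property names here are made up):

          // Xray vision gives the content script a clean view of page objects;
          // .wrappedJSObject exposes the underlying object the page actually sees.
          const pageNavigator = navigator.wrappedJSObject;

          // Read a property a page script may have defined:
          console.log(pageNavigator.somePageDefinedProperty);

          // To write a value the page can use, clone it into the page's
          // compartment first (otherwise the page gets a permission error):
          pageNavigator.injectedByExtension = cloneInto({ hello: "page" }, window);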

        1. 2

          Thanks! I was actually aware of wrappedJSObject, but this only works with Firefox, right? I took the more complicated approach here so that it would be cross-browser compatible. That’s one of the big advantages of WebExtensions in the first place :-).

        1. 3
          • Saturday: my employer is throwing a big party for family & friends just outside of Berlin, should be fun. Lots of other toddlers, so my little ones will also have a great time.
          • Sunday: Berlin marathon is happening. I’m not actually into running, but the event happens just right at my front door. So, we’ll watch and cheer!
          1. 1

            Am I understanding this right, that they’re basically trying to make virtual DOM implementations obsolete?

            1. 1

              Yes. Or more performant in some browsers than in others? 🤔

              Having read the thing twice, I still don’t understand how that interacts with requestAnimationFrame(), or rather, what its shortcomings are.
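
              For context, the usual requestAnimationFrame() batching pattern being compared against looks roughly like this (a generic sketch, not taken from the article):

                // Coalesce DOM writes into one callback that runs just before
                // the browser paints the next frame.
                let scheduled = false;
                const pendingWrites = [];

                function queueWrite(fn) {
                  pendingWrites.push(fn);
                  if (!scheduled) {
                    scheduled = true;
                    requestAnimationFrame(() => {
                      scheduled = false;
                      pendingWrites.splice(0).forEach(write => write());
                    });
                  }
                }

                // e.g. queueWrite(() => { element.textContent = "updated"; });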

            1. 15

              Aside from the creepiness, this just doesn’t make much sense:

              Microsoft wants to push Edge. Sure. It wants to do that to own the Web, to boost sales of Windows by locking out everyone else using ActiveX NT or whatever New Technology What Only Works On Windows. If it gets Edge adoption up to the numbers MSIE had fifteen years ago, it could make a go of that.

              Except fifteen years ago, the iPhone didn’t exist, Android didn’t exist, and practically nobody even tried to surf the Web on the phones they did have. Microsoft has never had a credible modern smartphone, and the era of Web devs being able to ignore mobile devices went out with Nu Metal and jeans you could hide a watermelon in. Microsoft has to know this, so it knows it has, at most, one half of a Nefarious Plan.

              This almost makes sense, and that troubles me.

              1. 4

                Uhm, you’re discounting how much traffic it will bring to their search engine Bing. AFAIU, for search engines, traffic is good money.

                1. 1

                  Seems like there would be easier ways to get traffic to a search engine, not to mention ones which wouldn’t make users as angry.

                  1. 10

                    To get people to use Bing, Microsoft has to get people to use Edge. You can’t beat the convenience of typing your search terms right into the URL bar, and most people don’t bother changing the defaults (unless they’re prompted to).

                    And it’s not just search. Google has been using Chrome to push all of their other products, many of which compete with Microsoft’s. There are no saints here, except maybe Firefox, and look where that’s gotten them.

                    1. 6

                      … a decent browser?

                      1. 3

                        I think @notriddle is talking about market share :P

                        1. 1

                          Ah, fair enough.

              1. 3

                mailbox.org - they allow completely anonymous accounts (not a feature I use, but something I like supporting!). They also support SPF, DKIM and all those things you usually won’t find easily with mail providers imho :)

                1. 4

                  there’s an idea floating around to require a CORS dance for local addresses: https://wicg.github.io/cors-rfc1918/ As always, there’s a problem with backwards compatibility that makes many implementers shy away.

                  1. 6

                    How ridiculous.

                    Here, we propose a mitigation against these kinds of attacks that would require internal devices to explicitly opt-in to requests from the public internet.

                    Or, you know, you could change the web browsers so that they can’t make requests to private addresses from public addresses. If I’m on https://lobste.rs/, I don’t want someone to be able to link to http://192.168.1.1/sendMailTo=spam.target@example.com/ or http://127.0.0.1/deleteAllMyShit/. Those devices should be able to sit on my network and be confident that I’m not stupid enough to let people on my network willy-nilly. And my web browser should enforce the distinction between public and private addresses.

                    CORS is an utterly stupid system that only serves a useful purpose in situations that should not exist in the first place. The idea that your browser will send off your cookies and other credentials in a request to example.com when that request was made from Javascript from a completely different domain like malicious.org is batshit crazy.
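
                    A sketch of the kind of check I mean (illustrative only; a real browser would have to apply it to the resolved address, after DNS, to also catch rebinding tricks):

                      // Refuse subresource requests to private/loopback ranges when the
                      // requesting page itself is on a public address.
                      function isPrivateIPv4(ip) {
                        const parts = ip.split(".").map(Number);
                        if (parts.length !== 4 || parts.some(Number.isNaN)) return false;
                        const [a, b] = parts;
                        return a === 10 ||                          // 10.0.0.0/8
                               a === 127 ||                         // loopback
                               (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
                               (a === 192 && b === 168) ||          // 192.168.0.0/16
                               (a === 169 && b === 254);            // link-local
                      }

                      isPrivateIPv4("192.168.1.1");   // true  -> block when requested from a public page
                      isPrivateIPv4("93.184.216.34"); // false -> allow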

                    1. 4

                      so that they can’t make requests to private addresses from public addresses

                      we only have private addresses because of NAT. There are still networks that have public IPv4 addresses for all devices, or have RFC 1918 addresses for IPv4, but public addresses for IPv6. This restriction you propose does not make that much sense.

                      I don’t want someone to be able to link to … http://127.0.0.1/deleteAllMyShit/.

                      This is how “native” OAuth apps work on the desktop. So this is actually used. Oh the horror indeed.

                      CORS is an utterly stupid system

                      Agreed.
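
                      For anyone unfamiliar with the loopback-redirect trick mentioned above, a rough Node sketch (the port and URLs are made up):

                        // The native app listens on 127.0.0.1 and the OAuth provider
                        // redirects the browser there with the authorization code.
                        const http = require("http");

                        const server = http.createServer((req, res) => {
                          const url = new URL(req.url, "http://127.0.0.1:8123");
                          const code = url.searchParams.get("code"); // authorization code, if present
                          res.end("You can close this tab now.");
                          if (code) {
                            console.log("got authorization code:", code);
                            server.close();
                          }
                        });

                        server.listen(8123, "127.0.0.1");
                        // The app then opens the provider's /authorize URL with
                        // redirect_uri=http://127.0.0.1:8123/ in the default browser.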

                      1. 1

                        we only have private addresses because of NAT.

                        Private IP addresses have nothing to do with NAT.

                      2. 3

                        CORS is essential for using APIs from the frontend. It also lets you do things like host your own copy of riot.im and still connect to matrix.org.
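
                        A tiny example of that use case, assuming matrix.org keeps serving permissive CORS headers (Access-Control-Allow-Origin: *) on its client API:

                          // Running on your own riot.im deployment, a different origin than
                          // matrix.org; the browser only hands the response to the script
                          // because the server's CORS headers allow it.
                          fetch("https://matrix.org/_matrix/client/versions")
                            .then(resp => resp.json())
                            .then(versions => console.log(versions));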

                        1. 1

                          Maybe a local daemon could be used to automatically log in to websites. Or maybe support message signing/encryption out of browser.

                      3. 3

                        to clarify: We can disallow and forbid all the things, turn the privacy crank up to 11 for all of our users. But most people won’t understand why websites are broken and will then use the other browser, because it just works for them.

                        Whenever we want to improve privacy and security for all users, we need to make deliberate, meaningful changes. Finding the middle ground is hard. But if we don’t, we do our users a disservice by effectively luring them into using a less privacy-protecting browser ;-)

                        The alternative is, of course, education and user-defined configuration. We do that too. But lots of people are busy, have other priorities or are resistant to education. It’s not enough to just help the idealists ;)

                        1. 2

                          Is it not possible to make the browsers return an error at the same speed?

                          1. 1

                            this isn’t really possible, as far as I can tell. You’d have to make every error take 10 seconds and reject anything that takes longer than 10 seconds, which is an unacceptable solution.

                            1. 1

                              What’s wrong with just holding quick fails for 10 seconds before returning, and failing anything that takes longer than 10 seconds to reply?

                              1. 1

                                it’s just such a long time to wait.

                                1. 1

                                  A hard value of 10 seconds would probably be too much, and it would not work anyway. The main problem is that the attacker can distinguish between error types using time measurements (whether it’s 3ms or a static 10s). Instead, what you want is to delay one error type to take a similar amount of time to the other - maybe you could pick a random delay based on previous errors of the same type.

                                  This kind of mitigation approach - at least for network times - is not that different from working on a really slow network. I don’t expect speed-focused browsers like Firefox/Chrome to add this kind of thing. But maybe one of the more privacy-aware spin-offs could implement it.
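
                                  A rough sketch of that idea, with all the numbers made up:

                                    // Pad the fast error path so its observable timing tracks a
                                    // running estimate of the slow error path.
                                    let slowErrorEstimateMs = 50; // updated from previously observed slow errors

                                    function recordSlowError(observedMs) {
                                      // exponential moving average of the "other" error type's duration
                                      slowErrorEstimateMs = 0.9 * slowErrorEstimateMs + 0.1 * observedMs;
                                    }

                                    async function failWithPadding(error, startedAt) {
                                      const elapsed = Date.now() - startedAt;
                                      const jitter = Math.random() * 10; // small random component
                                      const wait = Math.max(0, slowErrorEstimateMs + jitter - elapsed);
                                      await new Promise(resolve => setTimeout(resolve, wait));
                                      throw error;
                                    }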

                        1. 2

                          Before everyone jumps on the “bad google” hype: A few things here appear a bit odd to me.

                          Within the text the author speculates that another subdomain of his domain could be the reason for the trouble (“Now it could be that some other hostname under that domain had something inappropriate”), and then goes on to argue why he thinks it would be a bad thing for Google to blacklist his whole domain.

                          Sorry to say this, but: If it’s on your domain then it’s your responsibility. If you’re not sure if some of your subdomains may be used for malware hosting then please get your stuff in order before complaining about evil Google. It’s widespread to regard subdomains as something to not care too much about - as can be seen by the huge flood of subdomain takeover vulns that are reported in bugbounty programs - but that doesn’t mean that it’s right.

                          1. 7

                            On shared hosting services it’s pretty common to only have control over subdomains.

                              Think of github as a modern-day example: you have no control over what is served on malware-friends.github.io

                            1. 4

                              Technically, malware-friends.github.io is its own domain, not just a subdomain.

                              github.io is on the public suffix list, which makes it an effective top-level domain (eTLD).
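
                              For example, the npm “psl” package (assuming its parse() API) shows the split:

                                const psl = require("psl");

                                console.log(psl.parse("malware-friends.github.io"));
                                // tld: 'github.io', domain: 'malware-friends.github.io'

                                console.log(psl.parse("foo.example.co.uk").domain);
                                // 'example.co.uk' (co.uk is also a public suffix)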

                              1. 7

                                Correct me if I am wrong, but from what it looks like, this list doesn’t mean anything in terms of DNS and is just a community-maintained text file. Does Google actually review this file before marking domains as bad? I really doubt it, because then spammers would just use domains on that list.

                                1. 1

                                  Good point!

                                  I was just looking for a familiar example, but actually the PSL might be the root of the issue faced by the author.
                                  It reminds me of the master hosts file originally maintained at Stanford: shouldn’t that info be handled at DNS level?

                                  1. 1

                                    What do I do if I want to make a competitor to GitHub Pages? Do I have to somehow get big and important enough to have my domain end up on the public suffix list before I can launch my service?

                                    What if I want to make a self-hosted GitHub Pages alternative, where users of my software can set it up to let other people use my users’ instance? Do all users of my software have to make sure to get their domain names into the public suffix list?

                                    1. 2

                                      No, you have to spend four minutes reading the (very short) documentation that covers how to get on the list, open a PR adding your domain to their repo, and set a DNS record on the domain linking to the PR.

                                      It might even have been quicker to read the docs than to type out the question and post it here.
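
                                      If I remember the submission docs correctly, that DNS record is a _psl TXT record on the domain pointing at the pull request, something like (domain and PR number made up):

                                        _psl.pages.example.com.  IN  TXT  "https://github.com/publicsuffix/list/pull/1234"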

                                      1. 1

                                        You do not have to be big. Adding yourself to the list is a pull request in which you must be able to prove domain ownership.

                                        If you want browsers to consider a domain an effective TLD, you have to tell them.

                                1. 2

                                  I’ve always found it a bit too slow.

                                  1. 12

                                    I feel very uneasy about the safe browsing thing. Not only is it opaque and hostile to webmasters, it’s outright anti-competitive.

                                    I’ve seen it blacklist a number of independent file sharing websites, like the late pomf.se, allegedly for distributing malware. Google Drive is not immune either: not-yet-known malware would not be identified, and no checks are run on files over a certain size, so an ISO image of a game or a live CD with malware embedded in it would be ignored. Same with other big names. I haven’t seen any of them blacklisted though.

                                    It also blocked entire web archiving websites for the same reason.

                                    I could understand if it was a warning like that for untrusted certificates, but it also makes it nearly impossible to access the affected website.

                                    1. [Comment from banned user removed]

                                      1. 13

                                        Please stop spamming lobsters with links to the same blog post over and over again. The article is about Google’s safe browsing tool, just like the parent comment is.

                                        It seems to me you are bending this towards its literal meaning just as an excuse to link to your blog post.

                                        Hence, I can’t help but call your comment spam. In fact, most of your comments link to the same post.

                                        1. 2

                                          Thanks for sharing your opinion.

                                          I think you got it wrong: I’m not trying to drive attention to my article; I’m trying to inform people interested in web security and its tradeoffs (as @dmbaturin seems to be) about a vulnerability that, to my knowledge, affects millions of people, companies and governments.

                                          I’m eager to share on Lobsters more studies and exploits about this technical issue and the cultural problems it has shown. And I will share them, as soon as more has been written.

                                          For now I’m forced to link to my own articles (or the bug report you have closed) despite the risk of being labelled a spammer.
                                          Fortunately I do not care much about internet points and thus I can be freely downvoted.

                                          1. 5

                                            Fortunately I do not care much about internet points and thus I can be freely downvoted.

                                            In any time period, 90% of commenters receive zero downvotes total. The rest of the users have an exponential distribution and a handful out at the end total many dozen because of both high rates of commenting and high percentage of those comments earning downvotes. In the last two months I’ve been opening private conversations with the handful of extreme users asking them to recognize and reflect on their behavior, because the eventual consequence of not just failing to meet basic community norms but declaring opposition to them has been and must continue to be banning.

                                            Stop riding this ridiculous hobbyhorse through browser threads.

                                            1. 3

                                              I’m neither opposing nor declaring opposition to any “basic community norm” I’m aware of.

                                              I’m not saying I intend to spam. I’m saying that in that particular thread, the reference to my article was a useful (and optional) explanation for @dmbaturin of my argument that “there is no such thing as ‘safe browsing’”, and thus it was not spam, even though I was aware that people who didn’t like that article would downvote it (as they did: +5, -1 off-topic, -5 spam, -2 troll).

                                              That argument is a technical one, proved by an exploit that shows how any site you visit can tunnel into your private network. You can disagree with my evaluation of its severity, but that doesn’t turn it into spam, or me into a troll.

                                              It’s also on topic, because AFAIK Google Chrome is affected too.

                                              Stop riding this ridiculous hobbyhorse through browser threads.

                                              Why ridiculous?
                                              I never insult anyone here, and yet I get constantly insulted (as spammer, troll, ridiculous, bizarre).
                                              I do not care much, but I’d like to understand why you do so!

                                              I’m neither a troll, nor a spammer.

                                              I try to obey the rules of the communities I join. And their administrators.
                                              After our private exchange, I even refrained from asking @freddyb to inform Firefox users about the risks they are facing! Or even just to say whether Firefox users are vulnerable to these attacks!

                                              Fine, I will not cite this set of vulnerabilities on Lobsters again.
                                              TBH, I think that having taboo topics will hurt the quality of this site, but your server, your rules.

                                              Still, I would really appreciate it if you could explain here why a vulnerability that lets you tunnel into a corporate network (and carry out many other attacks on users’ privacy and security) is ridiculous.
                                              It’s an honest question, and I promise I will not reply further, whatever you will write.

                                              1. 5

                                                I’m neither opposing nor declaring opposition to any “basic community norm” I’m aware of.

                                                Downvotes are part of how community norms are expressed here. When you have ignored, for months, the scores of people telling you with downvotes and comments that your comments are inappropriate, a mod repeatedly intervening in your discussions and messaging you is an unambiguous warning that you are violating norms. You absolutely can’t or won’t take any of this to heart, and wave it all away as internet points or a failure to divine your intentions?

                                                1. 4

                                                  You absolutely can’t or won’t take any of this to heart and wave it all away as internet points or a failure to divine your intentions?

                                                  No, evidently I was not clear enough.
                                                  (sorry if I reply, but you are asking me a direct question, and in this comment you didn’t answer my question about the browsers’ vulnerability, which I promised not to reply to further, so I suppose I have to answer)

                                                  Whenever I get downvoted here, I read the topic again to understand if I got something wrong. Some downvotes I got here were well deserved, and I think they taught me what Lobsters is about.
                                                  For example the off-topic downvotes to the posts here and here, or the incorrect downvotes here, here and here (I still think that the inability to access the required information is what defines a partition, but I learnt to be more careful with author names, and gained a different perspective on the CAP theorem, there).

                                                  An interesting lesson I learnt came from the 3 troll downvotes here, which were deserved not because my argument was incorrect, but because I didn’t stick to the tone of @friendlysock.
                                                  I’m always careful to preserve the exact same tone (polite, ironic or sarcastic) used by the people I reply to, and in that specific comment, I didn’t. Sorry friendlysock, please accept my sincere apology.

                                                  Most of the time, however, I receive downvotes that do not seem to comply with the Lobsters Downvote Guidelines. When this happens I usually do not get offended, and I do not care much, since the community officially refuses them.

                                                  To my eyes, most of these downvotes that seem not compliant with the Lobsters Guidelines are usually on comments that:

                                                  An interesting example to understand why I do not care about such downvotes is the conversation with @cpnielsen and @geocar in this thread about GDPR: I got a total of -3 spam, -3 incorrect and -8 troll in that thread despite providing plenty of information, links to delve deeper and even references to the actual law.
                                                  A few weeks later, I even talked about the topic with a lawyer specialized in IT (who works for a multinational Italian-based bank and was working on its GDPR compliance) and I showed him the thread.
                                                  According to him, I was correct: initially he described most of the comments I replied to as either FUD or plain ignorance, but when I pointed out the reference provided by geocar, he agreed that geocar was probably talking about the UK legislation, not the European one.

                                                  Now, as this detailed analysis (which took me almost 4 hours to write) shows, I’m taking this community and its rules to heart.

                                                  But, in all honesty, I think I made a positive contribution here, despite the downvotes (most of which were not deserved).

                                                  I will go through the CSV you sent me to further explain my interpretation of the downvotes I got in these months as soon as possible. But I’m not sure you are reading the statistics correctly here. Above I have shown how 98 downvotes did not conform to the Downvote Guidelines: I do not know how many downvotes correspond to 1 standard deviation here, but how many standard deviations are 98 downvotes?

                                                  Also, I think this approach is pretty dangerous for the community itself. An actual troll or spammer might reduce the deviation of their own downvotes by downvoting others. Also, one should look at who downvotes whom, to get a clue about possible attacks from interest groups or cultural biases.

                                                  1. 3

                                                    To give more context for folks wondering: I’ve been discussing Shamar’s commenting style with him for months, in public and private, after many complaints and my own frustrations. I explained how he’s been violating site norms and antagonizing users. Every time, he’s re-litigated the technical details of a discussion rather than discuss the pattern of his behavior, even when I’ve repeatedly said that that is the thing that needs to be addressed. This particular comment is a response to a private message where, after several rounds of this repetition, I sent him a csv of all of his downvoted comments and invited him to explain the pattern. I banned him not just for this unending antagonism, which has escalated into him using Lobsters to troll Firefox developers, but for his unwillingness or inability to get out of the details to recognize and improve it, however many times or forms the feedback takes. I wish him luck finding a community that welcomes his discussion style, because even at the end of this I don’t think he’s deliberately malicious.

                                    1. 2

                                      The author uses a new profile directory to ensure a new browsing session is really new and fresh.

                                      However, if private browsing (or Firefox multi account container tabs) does not work, we consider this a security bug! If you have any insights how or where these bypasses occur, I’d be more than happy to dig deeper.

                                      1. 2

                                        Here’s an example of an article that detects and blocks private browsing (screenshot):

                                        http://www.baltimoresun.com/news/maryland/crime/ph-ho-cf-tessier-wallen-0907-story.html
                                        

                                        I’m avoiding making this a link since it seems certain external referrers will disable the block. I don’t know how they detect private browsing, but I would hope that it shouldn’t be possible.
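
                                        For what it’s worth, one heuristic that was widely used around this time (a guess as to what this particular site does, and browsers have been closing these holes) was to probe APIs that behaved differently in private windows, e.g. IndexedDB in Firefox:

                                          // IndexedDB used to be unavailable in Firefox private windows,
                                          // so a failing open() was taken to mean "private mode".
                                          function detectFirefoxPrivateMode(callback) {
                                            const request = indexedDB.open("pb-test");
                                            request.onerror = () => callback(true);    // open failed: likely private
                                            request.onsuccess = () => callback(false); // open worked: likely normal
                                          }

                                          detectFirefoxPrivateMode(isPrivate => console.log({ isPrivate }));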

                                        1. 1

                                          Eww. Indeed, they detect and complain about private browsing in Firefox. (Though they don’t complain about Firefox’s built-in tracking protection.) I’ll forward this to my colleagues.

                                        2. 1

                                          I saw that Netflix also blocks private browsing mode; not sure exactly if this was on Chrome or Firefox. Can’t dig deeper, it wasn’t my computer.

                                        1. 0

                                          I noticed you linked to my replies and called them condescending. I did not mean to be condescending. Please accept my apology :)

                                          I find your writings and submissions somewhat interesting, but also quite tiring - mostly because of volume and frequency. Maybe we will meet each other in real life, and then I will find the time to respond to each of your points individually. But at the current rate, I won’t be able to keep up and reply to all of your writings in a timely manner :)

                                          1. 1

                                            Apology accepted. :-)

                                              Please, try to understand my concerns: AFAIK each site (or CDN) that each Firefox user visits could traverse/bypass their firewall and proxy, violate their privacy (cache timing attacks) and so on… leaving no evidence.

                                            If I’m right, Firefox users should know this (and other browsers’ users too, if they are affected).
                                            And they should know if and how you are going to fix this problem.

                                            If I’m wrong, you should just say that (and possibly briefly explain how Firefox prevents such attacks).

                                          1. 4

                                            As far as I know, XP is still commonly found in developing countries. Firefox was the last browser to still support XP, so this is rather unfortunate.

                                            1. 3

                                              Hopefully XP users will fork Firefox to support XP, just like TenFourFox people did.

                                              1. 3

                                                that would seem nice for those people. But having an unmaintained, unpatched, old Microsoft system on the internet will bite them in the end. I’m hoping that people will find something else instead. Maybe a somewhat-usable Linux distribution + Wine? Maybe something built on ReactOS?

                                                I’m just hoping that we, as humanity, will be able to get rid of Windows XP :)

                                                1. 4

                                                  Yeah, they need to get off Windows entirely. Developing countries just keep making low-cost, usable, Linux boxes for them. Alternatively, a Linux distro that keeps drivers for as much old hardware as possible. I’m not sure if one already does. Each upgrade I do seems to kill something off on a random, old machine.

                                                2. 3

                                                  There is RetroZilla, which supports versions as far back as Windows 95. Though it is based on SeaMonkey, not Firefox.

                                              1. 1

                                                It’s almost impossible for the browser to tell that 192.168.1.1 on this network is different from 192.168.1.1 on that network. What it can do, though, is expect some meaningful user interaction with the form input elements before filling things in. Where we draw the line for “meaningful” and “user interaction” is tricky here, and it could potentially be circumvented with clickjacking, but it seems like the next logical step to solve this.
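
                                                A page-level sketch of that idea (in a real fix this check would live inside the browser’s autofill code; fillSavedCredentials is a made-up placeholder):

                                                  // Only fill after a genuine user gesture on the field:
                                                  // event.isTrusted is false for script-synthesized events.
                                                  const passwordField = document.querySelector('input[type="password"]');

                                                  passwordField.addEventListener("focus", event => {
                                                    if (!event.isTrusted) return;        // ignore synthetic focus from page scripts
                                                    fillSavedCredentials(passwordField); // hypothetical autofill hook
                                                  });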

                                                The Mitigations section in the article is pretty lame, imho. I think there is an actual technical solution to this.

                                                What are other people’s thoughts?

                                                1. 2

                                                  Abusing ‘by design’ behaviour to attack millions of WiFi networks.

                                                  During a recent engagement we found an interesting interaction of browser behaviour and an accepted weakness in almost every home router that could be used to gain access a huge amount of WiFi networks.

                                                  IMHO this shows (once more) how dangerous any “accepted weakness” in a mass-distributed artifact is.

                                                  I’d say that adopting HTTPS in home routers is the only correct solution, but each router should have its own certificate, so that stealing the private key from one router would not reduce the security of the others.
                                                  After all, if you give strangers enough physical access to your router for them to attack its firmware, you are doomed anyway.

                                                  On a side note: I’m not sure about the effectiveness of the proposed mitigation in the browsers (to avoid automatically populating input fields on unsecured HTTP pages), but it’s certainly an easy one to deploy while waiting for every router manufacturer to fix their production line.

                                                  1. 1

                                                    sure, the solution from plex media server software would make a lot of sense too: they give you a free subdomain and help you get a certificate for the device.

                                                    But making the router ecosystem change is a lost battle. those devices are cheap and come with lots of other, arguably worse, security problems.

                                                    I think the browser needs to be part of the solution for this specific problem, if you want to see change.

                                                    1. 2

                                                      I pretty much agree except for one point.

                                                      But making the router ecosystem change is a lost battle.

                                                      This assumes that the only battle field available is the market.

                                                      Law can easily fix the technological problem here.
                                                      And it could also fix more severe vulnerabilities, actually. ;-)

                                                      1. 1

                                                        Law is local. Browsers aren’t ;-)

                                                        1. 0

                                                          Funny!

                                                          Do you mean they are above the Law?
                                                          Or maybe that browser developers are?

                                                          I don’t think so.

                                                          The real issue, when competent people do not solve the problems they create, is that other, less competent people might have to.

                                                          For example: if router manufacturers won’t fix their products by themselves, they will be obliged to the moment governments realize this attack can be used to enter their private networks. Or the private networks of their banks… or their hospitals… and so on.

                                                          1. 1

                                                            No. My point is that laws work very differently in every country, while fixing one browser touches all countries. Nothing more and nothing less.

                                                            1. 2

                                                              Well, this is a very good point!
                                                              Indeed, it has been my own point since the very beginning of this conversation.

                                                              However, it’s nothing we can’t handle with a location-based strategy.
                                                              But do you really want to be forced to?

                                                              That’s why Technology is a continuation of Politics by other means.

                                                              Not just because we are political animals ourselves, but because (as you can see with these routers) software is always the cheapest component to change.

                                                              This gives us enormous power, but also a huge responsibility: with each hack, with each line we write, remove or refuse to write, we can make the world either better or worse, but we cannot justify the preservation of a broken status quo.

                                                              We cannot look at our own component in isolation and say “everybody does the same! it’s broken by design! it’s too expensive to fix!”. Nor can we delegate to others the fixing of our own faults: for example, while modifying all the browsers would mitigate the severity of this router issue, it’s too naive to think that browsers are the only components caching credentials!

                                                              Doing the right thing is totally up to us. And it is always possible.

                                                1. 10

                                                  Finally, Mozilla should survey its users to find out their attitudes towards moving DNS from their current service provider to Cloudflare. To do so, those users must first be well informed about what such a move would mean. Based on the survey results, an honest consent page can be generated that makes sure users know what they are agreeing to.

                                                  Well put!

                                                    The title is a bit misleading though. The most recent blog post actually says the networking team will look for different settings per region.

                                                  1. 5

                                                    Germany has an equivalent called Freifunk that is quite popular.

                                                    1. 2

                                                      Was it not true that it was illegal to have unsecured wifi in Germany? Or has this been overturned?

                                                      1. 8

                                                        It was never illegal, but the operator was liable for all network activity. That was recently reversed.

                                                        1. 2

                                                          The new law instead allows content owners to force individual wifi operators to block certain web sites from being reachable via their open network. The technical ‘how’ is of course left unspecified. It will be interesting to see how this rule gets applied in practice.

                                                        2. 3

                                                            What tedu says is correct. The liability problem is gone.

                                                          The official Freifunk firmware dodged that by routing outbound traffic through a VPN though.

                                                      1. 2

                                                        A colleague of mine actually ran (runs?) a project that allows you to put a small interstitial on your page that says “Hey, it seems you’re not running an adblocker”. Quite fun :-)

                                                          His code with a small howto is on github, but there’s also a longer howto on his blog if you prefer that.

                                                        1. 1

                                                          Thanks!

                                                        1. 3

                                                          folks seem to be rather nice here.

                                                          I’m also here because this website is less about start-up culture, than others. *coughs*

                                                          1. 7

                                                            I’m glad to see browsers getting serious about removing trackers from the web, but I fear this is starting a cat and mouse game with trackers. uMatrix works so well because trackers are on obvious domains like google-analytics. I think if browsers start blocking these 3rd-party trackers, then websites will move everything 1st-party and use webpack to bundle one mega JS file that includes the essential stuff and the trackers. They can also proxy all requests and cookies through their own server and then send it off to the trackers. None of that is hard, but it’s not done because it isn’t needed yet.

                                                            Trackers also have the upper hand in that they can rapidly change without fear of breaking other websites.
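
                                                            A sketch of that first-party proxying, with made-up endpoint and tracker URLs:

                                                              // The page only ever talks to its own origin; the server forwards
                                                              // the hit to the tracker, out of reach of content blockers.
                                                              const http = require("http");
                                                              const https = require("https");

                                                              http.createServer((req, res) => {
                                                                if (req.url.startsWith("/collect")) {
                                                                  https.get("https://tracker.example/collect" + req.url.slice("/collect".length))
                                                                       .on("error", () => {});
                                                                  res.writeHead(204);
                                                                  res.end();
                                                                  return;
                                                                }
                                                                res.writeHead(404);
                                                                res.end();
                                                              }).listen(8080);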

                                                            1. 9

                                                              Another reason it isn’t done, from what I understand, is that advertisers don’t trust content providers. They want a third-party to verify that the impressions (or the tracking data) they’re getting are reasonably genuine. I don’t think you’re wrong, but I do think there’s a little more standing in the way of that particular “nuclear option” than laziness.

                                                              1. 2

                                                                In that case there might be a rise of ad networks that bundle their trackers with jQuery or some other JS library that is impossible to block without breaking the websites. There is just such an insane amount of money in tracking that I don’t think it will be easily shut down.

                                                              2. 3

                                                                At some point, we’ll have to move away from simple block listing based on domains, yup. It’s an arms race, though. I agree. :-/

                                                                1. 2

                                                                  I also worry about the arms race. My hope is this plays out similarly to how the spam wars went over a decade ago: the Good Guys band together and use technology to reduce the baddies to a buzzing noise rather than letting them drown out a decent mode of communication.

                                                                  1. 12

                                                                    The spam wars did have collateral damage though: it’s a lot harder to host a mail server than it used to be.

                                                                    1. 8

                                                                      We “won” the mail spam war by dodging it. Email recentralized dramatically.

                                                                      IMHO it’s also a loss.

                                                                    2. 1

                                                                      If you’re on a slow connection and disable tracking mostly because it makes your browsing faster, then this doesn’t sound terrible. Ghostery just found 24 trackers on cnn.com (some of which presumably load other trackers, because if I pause blocking it finds 34). Bundling these into one “mega js” include should actually improve things for people who don’t use any blockers.

                                                                      Regarding proxying requests, I think that would make site owners stop and think a bit more about whether they really need 30+ trackers on their site. I think a lot of these are included because the barrier to entry is so low, so raising it can only be good.

                                                                    1. 3

                                                                      This raises some concerns for me, since umatrix does essentially the same thing (blocking third-party web requests), and it can break some sites until I fiddle with the access controls for that site.

                                                                      1. 2

                                                                        I’m sure there’ll be a way to unblock temporarily, just as the existing tracking protection allows you to do. - or am I misreading your concern?

                                                                        Some of those things you can already test in Nightly and I’m happy to forward your bug reports.