Threads for Shamar

  1. 12

    I feel very uneasy about the Safe Browsing thing. Not only is it opaque and hostile to webmasters, it’s outright anti-competitive.

    I’ve seen it blacklist a number of independent file-sharing websites, like the late, allegedly for distributing malware. Google Drive is not immune either: not just because not-yet-known malware would go undetected, but also because no checks are run on files over a certain size, so an ISO image of a game or a live CD with malware embedded in it would be ignored. The same goes for other big names. I haven’t seen any of them blacklisted, though.

    It also blocked entire web archiving websites for the same reason.

    I could understand if it were a warning like the one for untrusted certificates, but it also makes it nearly impossible to access the affected website.

    1. 2

      Before everyone jumps on the “bad Google” hype: a few things here appear a bit odd to me.

      Within the text the author speculates that another subdomain of his domain could be the reason for trouble (“Now it could be that some other hostname under that domain had something inappropriate”), and then continues to argue why he thinks it would be a bad thing for google to blacklist his whole domain.

      Sorry to say this, but: if it’s on your domain then it’s your responsibility. If you’re not sure whether some of your subdomains may be used for malware hosting, then please get your stuff in order before complaining about evil Google. It’s common to regard subdomains as something not to care too much about - as can be seen by the huge flood of subdomain takeover vulns reported in bug bounty programs - but that doesn’t mean it’s right.

      1. 7

        On shared hosting services it’s pretty common to only have control over subdomains.

        Think of GitHub as a modern-day example: you have no control over what is served on

        1. 4

          Technically, is its own domain, not just a subdomain.

          It is on the public suffix list, which makes it an effective top-level domain (eTLD).

          1. 7

            Correct me if I’m wrong, but from what it looks like, this list doesn’t mean anything in terms of DNS and is just a community-maintained text file. Does Google actually review this file before marking domains as bad? I really doubt it, because then spammers would just use domains on that list.

            1. 1

              Good point!

              I was just looking for a familiar example, but actually the PSL might be the root of the issue faced by the author.
              It reminds me of the master hosts file originally maintained at Stanford: shouldn’t that info be handled at DNS level?

              1. 1

                What do I do if I want to make a competitor to GitHub Pages? Do I have to somehow get big and important enough to have my domain end up on the public suffix list before I can launch my service?

                What if I want to make a self-hosted GitHub Pages alternative, where users of my software can set it up to let other people use my users’ instance? Do all users of my software have to make sure to get their domain names into the public suffix list?

                1. 2

                  No, you have to spend four minutes reading the (very short) documentation that covers how to get on the list, open a PR adding your domain to their repo, and set a DNS record on the domain linking to the PR.
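                  As I understand the PSL submission guidelines, the DNS record in question is a TXT record at the `_psl` label pointing at the pull request (the domain and PR number below are placeholders):

                  ```
                  _psl.example.com.  IN  TXT  "https://github.com/publicsuffix/list/pull/1234"
                  ```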

                  It might even have been quicker to read the docs than to type out the question and post it here.

                  1. 1

                    You do not have to be big. Adding yourself to the list is a pull request in which you must be able to prove domain ownership.

                    If you want browsers to consider a domain an effective TLD, you have to tell them.

            1. 4

              In my Copious Free Time (read: after my kids are in college, so ~15 years from now) I’d like to port Oberon to the Raspberry Pi. It would be fun.

              1. 2

                It was one of the ideas I had for the Raspberry Pi 3. I keep thinking about buying one. They say it’s the best choice even if the hardware is weaker, since so much software already works on it. Helps newcomers out. I want an always-on box that doesn’t use much power.

                Oberon port, either Oberon or A2 Bluebottle, was one of my ideas. I also thought about porting it to Rust then backporting it to Oberon. Basically, knocking out any temporal errors plus letting it run without GC. Then, Oberon-to-C-to-LLVM for performance boost. Oberon in overdrive. ;)

                If you wait 15 years on your project, then Wirth’s habits might mean there will be another 5-10 Oberons released with minor feature changes before you get started. He might try dropping if statements or something. Who knows.

                1. 2

                  He might try dropping if statements or something. Who knows.

                  Well, actually I would remove the FOR loop, given that it’s just syntactic sugar for a WHILE.

                  However for some reason it seems that Wirth likes it. :-)
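                  The desugaring really is mechanical; a small sketch in C (function names are made up for illustration):

                  ```c
                  #include <assert.h>
                  #include <stdio.h>

                  /* The same sum written with FOR and with its WHILE desugaring:
                     init, condition, and step just become explicit statements. */
                  int sum_for(int n) {
                      int s = 0;
                      for (int i = 1; i <= n; i++)
                          s += i;
                      return s;
                  }

                  int sum_while(int n) {
                      int s = 0;
                      int i = 1;           /* init */
                      while (i <= n) {     /* condition */
                          s += i;          /* body */
                          i++;             /* step */
                      }
                      return s;
                  }

                  int main(void) {
                      assert(sum_for(10) == sum_while(10));
                      printf("%d\n", sum_for(10)); /* prints 55 */
                      return 0;
                  }
                  ```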

                  Anyway… Wirth is a light source in these dark days: he will always remind us that less features mean less bugs.

                  1. 5

                    less features mean less bugs

                    It can also mean more bugs in the next layer up, if the dearth of features at a particular layer requires people to constantly reimplement even basic functionality.

                    1. 1

                      This is exactly why I countered Wirth’s philosophy. We see it in action already where modern languages can:

                      (a) improve productivity expressing solutions with better abstractions or type inference

                      (b) improve performance with features like built-in parallelism and highly-optimizing compilers

                      (c) improve safety/security with things like better type systems

                      1. 1

                        @jclulow and you raise a good objection, but I think you are confusing easiness with simplicity.

                        I see the features that a language (or an operating system) supports like the dimensions that can describe a program (or a dynamic ecosystem of users interacting with hardware devices).

                        Thus, to me, a high number of features (such as many system calls in an OS or many forms in a programming language) is a smell of a (potentially) broken design, full of redundancy and of features that do not compose well.

                        On the flip side, a low number of features can mean either low expressivity (often by design, as in most domain-specific languages) or a better-designed set of orthogonal features. Or it might just be a quick and dirty hack whose sole goal was to solve a contingent issue in the fastest and easiest possible way.

                        My favourite example to explain this concept is to compare Linux+Firefox with Plan 9 (and then Plan 9 to Jehanne, that is specifically looking for the most orthogonal set of abstractions/syscalls that a powerful distributed OS can provide).

                        It’s not just a matter of code bloat, performance, security and so on, it’s a matter of power and expressivity: with few well designed features you can express all the features provided by more complex artifacts… but also a wide new set that they cannot!

                        Simplicity is also an important security feature, in particular for programming languages.

                        1. 1

                          I get what you’re saying. Your view is actually more nuanced than Wirth’s. The main measure of complexity for Wirth was how long it took to compile the compiler. If it took a lot longer, he’d drop the feature. I think longer compiles are fine if they give us something in return. I also favor an interpreter + compiler setup for rapid development followed by high optimization. We have a lot of means to get high correctness regardless of the features in a language. I’m seeing all rewards with no losses. Certainly people can put in stupid features or needless complexity, which I’d be against. Wirth just set the bar way, way too low.

                          “with few well designed features you can express all the features provided by more complex artifacts…”

                          Macros, modules, generics, polymorphism… a few examples.

                          “Simplicity is also an important security feature, in particular for programming languages.”

                          You probably shouldn’t be citing that paper given it’s one of the rarest attacks in existence. Compiler optimizations screw up security the most but people always cite Thompson. Anyway, I originally learned security reading work of the guy Thompson ripped off: Paul Karger. I wrote here about Karger’s invention of the problem and how you solve it. It’s a totally-solved problem. For new solutions, we have a page collecting tiny and verified versions of everything to help newcomers build something more quickly.

                          1. 3

                            with few well designed features you can express all the features provided by more complex artifacts…

                            Macros, modules, generics, polymorphism… a few examples.


                            You do not need a compiler to generate code. You just need the compiler to verify the generated code when it is compiled.

                            Embedding code generation in a compiler (or in a runtime, as JIT compilers do) can be convenient, but I’m not sure it’s really needed.

                            Obviously it’s pointless to restrict yourself from using a supported language feature just to pick “the good parts”. I use macros in C, for example. But I also generate C in other ways when it’s appropriate, and it’s pretty simple and usable. Thus I like Oberon, which omits a possible source of complexity, just like I like LISP and Scheme, which maximise the gain/complexity ratio by raising the gain.
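                            A minimal sketch of that workflow, with made-up names: the generator is an ordinary program that emits C source as text, and a separate compiler invocation then type-checks what it produced.

                            ```c
                            #include <assert.h>
                            #include <stdio.h>
                            #include <string.h>

                            /* Emit a C accessor function as plain text; the receiving
                               compiler, not this generator, verifies the result. */
                            static int emit_getter(char *out, size_t cap,
                                                   const char *name, const char *field) {
                                return snprintf(out, cap,
                                                "int get_%s(struct rec *r) { return r->%s; }\n",
                                                name, field);
                            }

                            int main(void) {
                                char buf[128];
                                emit_getter(buf, sizeof buf, "age", "age");
                                assert(strlen(buf) > 0);
                                fputs(buf, stdout); /* redirect to gen.c, then: cc -c gen.c */
                                return 0;
                            }
                            ```

                            The point is that no macro system or compiler plugin is involved; the generated text goes through the same front door as hand-written code.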

                            You probably shouldn’t be citing that paper given it’s one of the rarest attacks in existence.

                            I stand corrected (and thanks for these references!).

                            But the point I was trying to make was more general: we cannot trust “trust” in technology.

                            And the laymen cannot trust us either, because when they do, we fool and exploit them.

                            Thus we need to rebuild the IT world from the ground up, focusing on simplicity from the very beginning. So that fellow programmers can rapidly inspect and understand everything.

                            The main measure of complexity for Wirth was how long it took to compile the compiler. […]
                            Wirth just set the bar way, way, too low.

                            He really doesn’t need my defence, but I don’t think that minimizing the compiler’s compilation time is the goal; it’s just a metric.

                            AFAIK, the reasoning is: if it’s slow to compile (with a decent compiler), it means it could be simpler.

                            In other words, compilation time is a sort of canary that complexity kills first.

              1. 3


                You’re right. But your approach is flawed.

                Can you extol the virtues of a non-executable web, instead?

                To @arnt and the commenters on the article, who talk about tricking users into clicking YES… how exactly is a website going to prompt when the allegorical NX bit is set?

                1. 3

                  Well I think it’s pretty easy to imagine a web without JavaScript:

                  • aesthetically it would be pretty similar to the current one, because the lack of JS would be offset by
                    • better typography
                    • better stylesheets (to be kept NOT Turing complete)
                    • probably more tags (imagine a <slide> tag or a <tree> tag)
                  • we could use standard XMLNS to enrich the contents
                    • with semantic context
                    • with better document search
                    • with accessibility tips for machines
                  • we would have better forms
                    • with more controls
                    • with a micro language to validate inputs
                  • it would be faster (to download and to render)
                  • it would be safer (obviously, given this class of attacks)
                  • it would be more privacy friendly (JavaScript can even detect if you zoom on text or on a certain image and tell it to Google so that they can advertise better glasses or something)
                    • it could even have an <advertisement> tag to make them less annoying
                  • it would be more stable (for various reasons: lighter tabs, simpler…)
                  • it would be easier to learn for newcomers
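                  Part of this wished-for declarative power already exists in plain HTML5; for instance, form validation with a small pattern micro-language needs no JS at all (field names here are made up):

                  ```html
                  <!-- Declarative validation: the browser enforces these
                       constraints without any script. -->
                  <form method="post" action="/subscribe">
                    <input type="email" name="mail" required>
                    <input type="text" name="zip" pattern="[0-9]{5}" title="5-digit code">
                    <button>Subscribe</button>
                  </form>
                  ```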


                  While I do think that the vulnerabilities we are talking about are so severe and dangerous that they deserve the emergency fix I proposed in the bug report, I do not think that we should suddenly remove JavaScript from the web.

                  While it’s true that I do not like the JavaScript language, I just think that people should have the right to choose who can execute custom programs on their device.

                  IMHO, making JavaScript opt-in on a per site basis would not break the Web, it would fix it.

                  Many wouldn’t use such a feature and would run every script they can reach.
                  But many others would use such freedom and control. And right now, they cannot.
                  Instead, their security is at risk. And they are unaware of that risk.

                  1. 2

                    So, you want to load and view documents that are rendered from declarative source code rather than imperative source? I can get behind that!

                    JavaScript can even detect if you zoom on text or on a certain image and tell it to Google so that they can advertise better glasses or something

                    You’re worried that high resolution micro-interaction data (mouse hover, scroll linger, etc) may reveal that I need eyeglasses? :) It may reveal that, but I’m more concerned that it would reveal my favorite color, or favorite body type. This type of data collection is my pet peeve. It’s nobody’s business which images or text I linger over in my idle time!

                    In the distant past of the late nineties, I used to teach fellow travelers that .txt, .jpg, .gif, and .html files are safe and .exe, .com, and .bat files were unsafe. It was a bright line and it helped users be responsible for their own online safety. That browser vendors decided to allow executable scripts inside otherwise declarative documents obliterated that bright line.

                    1. 1

                      So, you want to load and view documents that are rendered from declarative source code rather than imperative source?


                      Whenever possible. And extending what is possible to do this way.

                      You’re worried that high resolution micro-interaction data (mouse hover, scroll linger, etc) may reveal that I need eyeglasses?

                      Exactly. And you should be worried too.

                      Your sexual tastes have some economical value, but only in a few segments.

                      Any information about your health has a huge value in every market.

                      As a Data Science hobbyist, I can assure you that a few bits of data about your health can be used to build interesting profiles of a person.

                      Such profiles are gold to insurance companies, banks, potential employers and so on.

                      That’s something people should consider before posting their tracking records online, as over time they let strangers build precise profiles about their health. But it’s possible to get such information from unaware users with any web application or online casual game.

                      It’s easy to trick people into solving a puzzle that clearly reveals how they see colours or shapes. Or how well they hear.

                      It’s possible even without JavaScript.
                      But JavaScript makes it cheap for anybody to collect such data while you are simply reading a magazine.

                1. 2

                  Opt-in privacy protections have fallen short. […]
                  These efforts have not been successful. Do Not Track has seen limited adoption by sites, and many of those that initially respected that signal have stopped honoring it.

                  That’s because you put the freedom to opt-in privacy protections on the wrong side of the TCP connection.
                  It’s a bit naive to trust the autoregulation of global markets for matters that most people do not understand.

                  By providing a clear set of controls to give their users more choice over what information they share with sites, Mozilla takes a step in the obviously correct direction. Out of curiosity, are you going to change the Cross-Origin Resource Sharing implementation or the Fetch Living Standard first?

                  Note, however, that while this improves the status quo, it does not fix the problem.

                  For example, a CDN or a JS-based analytics service could simply serve, to all users from a certain region (or a certain range of IPs), a script that connects a WebSocket and follows the users’ navigation with good approximation across the web sites that trust it, by comparing the connection IP to the Origin header. The approximation gets even better if they manage to access other information, such as the HTTP requests of those web sites, as any cloud hosting provider or distributed caching service could (think of CDNs controlled by Google, Amazon, Cloudflare).

                  We leak information continuously!
                  And such information can be stored as data.

                  Moreover, once a server is able to identify users (or even just clusters/groups of them), it can serve custom JavaScript to bypass their corporate firewall and proxy. Or DoS their computer by getting it banned from its private network… and so on.

                  Despite the sandboxing, the attack surface is practically unbounded.

                  1. 1

                    A good start would be to block cross-domain JS, i.e. stop fetching scripts from another domain and executing them. With something like Decentraleyes, CDN-hosted libraries can be served locally, limiting tracking and the attack you describe above.
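                    For site operators, something close to this is already expressible with a Content-Security-Policy response header; a minimal example that allows only same-origin scripts:

                    ```
                    Content-Security-Policy: script-src 'self'
                    ```

                    Of course that only protects visitors of sites that choose to send it; it’s not the browser-side default you’re after.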

                    Ideally we’d move to a web without JS. Web apps (HTML+CSS+JS) would need to (by default) come from a curated “app store”.

                    1. 1

                      A good start would be to block cross domain JS […]

                      I’m afraid this approach would not tackle the core issue at all.

                      CDNs would just change their operations a little and make you configure a subdomain to work around the same-origin policy. Same for JS-based analytics and so on…

                      Also, the adversary might be a phisher posing as a site you trust, or even a site you have to use for whatever reason.

                      Ideally we’d move to a web without JS.

                      Maybe. But I think it’s reasonable to think that opt-in JavaScript will come first.

                      Web apps are currently supplying a legitimate need to distribute the computation among different machines.

                      The problem is that they are built at the wrong layer, on top of something that people consider safer than installing software, even when such software is downloaded only once, cryptographically signed by the copyright holder, and so on…

                      But even when these issues are fixed, this need will remain and will be served otherwise. I think that mainstream operating systems will then be much different from today’s.

                      1. 2

                        CDNs would just change a little their operations and make you configure a subdomain to work around the same origin policy. Same for JS-based analytics and so on…

                        That’s a good point!

                  1. 3

                    While I basically agree here, the problem is: if you are a new developer and you search the internet for how to build a menu for your website, basically all you will get back is giant JS frameworks that take up gobs of space, instead of the few lines of CSS and HTML5 you need (without any JS) to actually build a menu. I don’t have a good solution to this, but I see it as a major contributor to why this craziness keeps growing in size.

                    I think it also doesn’t help that when we do get new things like webauthn, we then only get a JS interface to use them, somewhat forcing our hand to require JS if we want nice things. That doesn’t mean we have to shove 500MB of JS at the user to use webauthn, but we can’t do it with just HTML and a form anymore.

                    1. 8

                      That’s because nobody should need to search the internet for how to make a menu. It’s a list of links. It’s something you learn in the first hour of a lecture on HTML, chapter 1 of a book on HTML.

                      You probably neither need nor want to use webauthn. Certainly not yet! It was published as a candidate recommendation this year. Give others a chance to do the experimenting. Web standards used to take 10 years to get implemented. Maybe don’t wait quite that long, but I’m sure you’ll do fine with an <input type="password"> for a few years yet.

                      1. 3

                        I was just using both as an example, I apologize for not being clear.

                        Yes, a menu is just a list of links, but most people want drop-down or hamburger menus now, and that requires either some CSS or some JS. Again, go looking and all the examples will be in JS, unless you search specifically for CSS examples.
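                        For the record, the CSS-only version really is just a few lines; a rough sketch (the links are placeholders):

                        ```html
                        <!-- A drop-down menu in plain HTML + CSS: the submenu
                             appears on hover; no JavaScript involved. -->
                        <style>
                          nav ul          { list-style: none; }
                          nav li          { position: relative; }
                          nav li ul       { display: none; position: absolute; }
                          nav li:hover ul { display: block; }
                        </style>
                        <nav>
                          <ul>
                            <li><a href="/">Home</a></li>
                            <li><a href="/docs">Docs</a>
                              <ul>
                                <li><a href="/docs/api">API</a></li>
                                <li><a href="/docs/faq">FAQ</a></li>
                              </ul>
                            </li>
                          </ul>
                        </nav>
                        ```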

                        This is true of just about everything you want to do in HTML/Web land, the JS examples are super easy to find, the CSS equivalents are hard to find, and plain HTML examples are super hard to find.

                        Anyways, I basically agree webauthn isn’t really ready for production use, but again, both of these were examples, and webauthn just because it’s something I’m currently playing with. You can find lots of new web tech that is essentially JS only, despite it not needing to be, from a technical perspective. This is what I’m saying.

                        1. 3

                          I understand it’s just an example, but that’s my point really: it’s yet another example of something people way overcomplicate for no good reason. ‘It’s the first google result’ just isn’t good enough. It’s basic competency to actually know what things do and how to do things in HTML and CSS, and not accomplish everything by just blindly copy-pasting whatever the first google result for your task is.

                          Web authentication? Sure it’s just an example, but what it’s an example of is people reinventing the wheel. What ‘new’ web technology isn’t just a shitty Javascript version of older web technology that’s worked for decades?

                          1. 2

                            LOL, “overcomplicate for no good reason” seems to be the entire point of so many JS projects.

                            I think we agree more than we disagree.

                            New developers have to learn somehow, and existing sites and examples tend to be a very common way people learn. I agree web developers in general could probably use more learning around CSS and HTML, since there is a LOT there, and they aren’t as easy as people tend to think on the surface.

                            Well, webauthn has a good reason for existing. We all generally know that plain USER/PASS isn’t good enough anymore, especially when people use such crappy passwords and developers do such a crappy job of handling and storing authentication information. There are alternative solutions to FIDO U2F/webauthn, but none of them has had much, if any, success when it comes to easy-to-use, strong 2FA. The best we have is TOTP at this point, and it’s not nearly as strong cryptographically as U2F. I don’t know of any web technology that’s worked for decades that competes with it. Google has fallen in love with it and, as far as I know, requires it for every employee.

                            The closest would probably be mutual (client-certificate) TLS authentication, but it’s been semi-broken in every browser for decades, the UI is miserable, and nobody has ever had a very successful long-term deployment (that I’m aware of). I know there was a TLS cert vendor that played with it, and Debian played with it some, both aimed at very technical audiences, and I don’t think anyone enjoyed it. I’d love to be proven wrong, however!

                            Mutual TLS auth works better outside of the browser, things like PostgreSQL generally get it right, but it’s still far from widely deployed/easy to use, even after having decades of existence.

                            That said, I’m sure there are tons of examples of crappy wackiness invented in web browser land. I have to be honest, I don’t make a living in web development land, and try to avoid it for the most part, so I could be wrong on some of this.

                      2. 1

                        Maybe check out Dynamic Drive. I used to get CSS-based effects off it for DHTML sites in the early 2000s. I haven’t dug into the site to see if they still have lots of CSS vs JavaScript, though. A quick glance at the menus shows CSS menus are still in there. If there’s plenty of CSS left, you can give it to web developers to check out after teaching them the benefits of CSS over JavaScript.

                        I also noticed the first link on the left is an image optimizer. Using one is recommended in the article.

                        EDIT: The eFluid menu actually replaces the site’s menu during the demo. That’s neat.

                        1. 3

                          An interesting project that shows how modern layouts can be built without JavaScript is W3C.CSS.

                          /cc @milesrout @zie

                          1. 3

                            Thanks for the link. Those are nice demos. I’d rather they not have the editor, though, so I could easily see each one in full screen. They could have a separate link for the source or live editing, as is common elsewhere.

                      1. 0

                        I noticed you linked to my replies and called them condescending. I did not mean to be condescending. Please accept my apology :)

                        I find your writings and submissions somewhat interesting, but also quite tiring - mostly because of volume and frequency. Maybe we will meet each other in real life, and then I will find the time to respond to each of your points individually. But at the current rate, I won’t be able to keep up and reply to all of your writings in a timely manner :)

                        1. 1

                          Apology accepted. :-)

                          Please, try to understand my concerns: AFAIK each site (or CDN) that a Firefox user visits could traverse/bypass their firewall and proxy, violate their privacy (cache timing attacks) and so on… leaving no evidence.

                          If I’m right, Firefox users should know this (and other browsers’ users too, if they are affected).
                          And they should know if and how you are going to fix this problem.

                          If I’m wrong, you should just say that (and possibly briefly explain how Firefox prevents such attacks).

                        1. 6
                          1. They want to move DNS to Cloudflare. I’m sure there are still hot debates happening on mailing lists, which also means the plan isn’t final.
                          2. They want to change the default behavior. People who care can opt out. I’d say the majority of their users (non tech savvy) don’t even know what DNS is and would benefit from using Cloudflare over their ISP for DNS (Comcast, AT&T, Verizon come to mind – I’d trust Cloudflare over them any day).

                          Maybe there have been too many click-baity articles published already saying Firefox is going to shove this down our throats. I see this largely as a net gain, if it happens to pass. But that said, there’s no reason to be outraged. If you’re upset, it means you care. And if you care, you can opt out. Easy.
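                          The opt-out, as I understand current Firefox builds, is a couple of about:config preferences (exact values worth double-checking against Mozilla’s documentation; the resolver URL is a placeholder):

                          ```
                          // about:config
                          network.trr.mode = 5    // 5 = DNS-over-HTTPS explicitly disabled
                          network.trr.uri  = https://dns.example/dns-query  // or a resolver you trust
                          ```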

                          1. 2

                            You’re still replying from an overly US-centric view though.

                            (disclaimer: I’m a PowerDNS employee)

                            1. 1

                              It’s a pretty common issue. Even outside the US.

                              (disclaimer: I’m European)

                          1. 6

                            The Oberon operating system is another homogeneous OS, from Niklaus Wirth.

                            It is entirely programmed in the Oberon programming language and each part of the system can be easily modified just like with TempleOS or a Lisp machine.

                            1. 3

                              re homogeneous OS: Exactly! I was thinking it should have been mentioned first or early on while reading it. Then I thought maybe the author just assumed everyone had heard of it. I focused on LISP/Smalltalk. It definitely should’ve been mentioned, though, since it competed with the C family of languages for systems programming, was safe, compiled fast, and a whole school ran on it at one point.

                              re just like a LISP machine: No, see my article. There’s barely any comparison except the simplicity and fast iterations of the base language. Say you had an error in your app which was due to your OS. You could load it up in the IDE, seeing the error plus the current state of the app. If it was the OS, you could load its internals. Then you could hot-fix the running system, with changes propagated throughout the OS.

                              Far as I know, Oberon was miles away from ever doing that. It’s that kind of thing that made me do counterpoints on Wirth where simplicity, esp for compiler, shouldn’t be main focus. I’d rather have the complexity of implementing a LISP machine if it meant my job as a developer on average project was that much more productive. Likewise, complex compilers like LLVM make code super-fast. I’d like to have both fast-to-compile and fastest-when-running as options. Wirth only wanted former.

                              So, it was homogeneous, easy to program, and easy to compile. It wasn’t near the LISP experience, though. That’s the gap I want folks perfecting imperative languages to close. One of them anyway.

                              1. 2

                                Well, you are right… but such a comparison is a bit unfair.

                                LISP hackers were lovely cheaters in this regard.

                                It’s easy to live-edit your kernel if you have first soldered in an interpreter to use as the processor! :-D

                                1. 2

                                  Haha. Yeah, well, some universities were willing to invest in good hardware vs the cheapest crap they could find. Too bad they were rare and/or the prices didn’t come down. Nowadays, you can put a LISP interpreter on an FPGA. There’s also an x86 kernel for Common LISP in development. I haven’t looked into it deeply, though.

                                  1. 2

                                    Now that you make me think about it, I do remember having seen an FPGA-based machine for Oberon-2 for sale somewhere (can’t find the link anymore, sorry). Never tried one, though… :-(

                                    In 2015, however, Wirth wrote about a low cost one.

                                    1. 2

                                      It was called OberonStation. The site is down now, but the article has a description and pics.

                            1. 1

                              It’s almost impossible for the browser to tell that on this network is different from on that network. What it can do, though, is expect some meaningful user interaction with the form input elements before filling things in. Where we draw the line on “meaningful” and “user interaction” is tricky, and the check could potentially be circumvented with clickjacking, but it seems like the next logical step to solve this.
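                              The “meaningful user interaction” gate could be sketched roughly like this. All names here are hypothetical except `Event.isTrusted`, which browsers really do set to true only for events generated by genuine user input:

```javascript
// Sketch of the decision rule: autofill credentials only on a trusted
// user gesture. shouldAutofill is a made-up helper, not a browser API.
function shouldAutofill(event) {
  // Event.isTrusted is false for events synthesized by page scripts
  // (e.g. input.focus() or dispatchEvent()), so purely script-driven
  // focus would never trigger credential filling.
  return event.type === 'focus' && event.isTrusted === true;
}

console.log(shouldAutofill({ type: 'focus', isTrusted: true }));  // genuine user gesture
console.log(shouldAutofill({ type: 'focus', isTrusted: false })); // scripted focus()
```

                              The clickjacking caveat from above still applies, of course: an attacker can solicit a perfectly “trusted” click on top of an invisible form.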

                              The Mitigations section in the article is pretty lame, imho. I think there is an actual technical solution to this.

                              What are other people’s thoughts?

                              1. 2

                                Abusing ‘by design’ behaviour to attack millions of WiFi networks.

                                During a recent engagement we found an interesting interaction of browser behaviour and an accepted weakness in almost every home router that could be used to gain access a huge amount of WiFi networks.

                                IMHO this shows (once more) how dangerous any “accepted weakness” in a mass-distributed artifact is.

                                I’d say that adopting HTTPS in home routers is the only correct solution, but each router should have its own certificate, so that stealing the private key from one router would not reduce the security of the others.
                                After all, if you give strangers enough physical access to your router for them to attack its firmware, you are doomed anyway.
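                                Per-device certificate generation at first boot could be as simple as the following sketch (the paths and the CN are made up for illustration):

```shell
# Hypothetical first-boot provisioning step: each router mints its own
# key pair and self-signed certificate, so there is no shared private
# key whose theft from one device would compromise the whole fleet.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout /tmp/router.key \
  -out /tmp/router.crt \
  -days 3650 \
  -subj "/CN=router.local"
```

                                A self-signed certificate still means a browser warning on first visit, but trust-on-first-use is a clear improvement over the shared or absent keys shipped today.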

                                On a side note: I’m not sure about the effectiveness of the mitigation proposed for browsers (to avoid automatically populating input fields on unsecured HTTP pages), but it’s certainly an easy one to deploy while waiting for every router manufacturer to fix their production line.

                                1. 1

                                  Sure, the solution from the Plex media server software would make a lot of sense too: they give you a free subdomain and help you get a certificate for the device.

                                  But making the router ecosystem change is a lost battle. Those devices are cheap and come with lots of other, arguably worse, security problems.

                                  I think the browser needs to be part of the solution for this specific problem, if you want to see change.

                                  1. 2

                                    I pretty much agree except for one point.

                                    But making the router ecosystem change is a lost battle.

                                    This assumes that the only battle field available is the market.

                                    Law can easily fix the technological problem here.
                                    And it could also fix more severe vulnerabilities, actually. ;-)

                                    1. 1

                                      Law is local. Browsers aren’t ;-)

                                      1. 0


                                        Do you mean they are above the Law?
                                        Or maybe that browser developers are?

                                        I don’t think so.

                                        The real issue, when competent people do not solve the problems they create, is that other, less competent people might have to.

                                        For example: if router manufacturers won’t fix their products by themselves, they will be obliged to the moment governments realize this attack can be used to enter their private networks. Or the private networks of their banks… or their hospitals… and so on.

                                        1. 1

                                          No. My point is that laws work very differently in every country, while fixing one browser touches all countries. Nothing more and nothing less.

                                          1. 2

                                            Well, this is a very good point!
                                            Indeed, it’s my own point since the very beginning of this conversation.

                                            However, it’s nothing we can’t handle with a location-based strategy.
                                            But do you really want to be forced to?

                                            That’s why Technology is a continuation of Politics by other means.

                                            Not just because we are political animals ourselves, but because (as you can see with these routers) software is always the cheapest component to change.

                                            This gives us enormous power, but also a huge responsibility: with each hack, with each line we write, remove, or refuse to write, we can make the world either better or worse, but we cannot justify the preservation of a broken status quo.

                                            We cannot look at our own component in isolation and say “everybody does the same! it’s broken by design! it’s too expensive to fix!”. Nor can we delegate the fixes of our own faults to others: for example, while modifying all the browsers would mitigate the severity of this router issue, it’s too naive to think that browsers are the only components caching credentials!

                                            Doing the right thing is totally up to us. And it is always possible.

                              1. 8

                                Turn off JS then? Isn’t this what a modern browser is by definition? A tool that executes arbitrary code from URLs I throw at it?

                                1. 7

                                  I am one of those developers who surfs the web with “javascript.options.wasm = false” and NoScript, blocking about 99% of all websites from running any JavaScript on my home machine unless I explicitly turn it on. I’ve also worked on various networks where JavaScript is just plain turned off and can’t be turned on by regular users. I’ve heard some, sadly confidential, war stories that have led to these policies. They are similar in nature to what the author states in his Medium post.

                                  If you want to run something, run it on your servers and get off my laptop, phone, TV, and even production machines. Those are mine, and if your website can’t handle that, then your website is simply terrible from a user-experience viewpoint, dreadfully inefficient, and doomed to come back haunting you when you are already in a bind over an entirely different customer or issue. As a consequence of this way of thinking, a few web-driven systems I wrote more than a decade ago are still live and going strong, without a single security incident and without any performance issues, while reaping the benefits of the better hardware they’ve been migrated to over the years.

                                  Therefore it is still my firm belief that a browser is primarily a tool to display content from random URLs I throw at it, and not an application platform that executes code from the URLs thrown at it.

                                  1. 3

                                    That’s a fine and valid viewpoint to have, and you are more than welcome to disable JS. But as a person who wants to use the web as an application platform, are you suggesting that browsers should neglect people like myself? I don’t really understand what your complaint is.

                                    1. 2

                                      But as a person who wants to use the web as an application platform, are you suggesting that browsers should neglect people like myself?

                                      I don’t think so. But using Web Applications should be opt-in, not opt-out.

                                      1. 3


                                        There are just too many issues with JavaScript-based web applications. For example: performance (technical and non-technical). Accessibility (blind people perceive your site through a 1x40 or 2x80 Braille-character display, so essentially half a line or two lines of a terminal). Usability (see Gmail’s pop-out feature, which is missing from most modern web applications and which you get almost for free if you just treat the web as a fancy document-delivery/viewing system). Our social status as developers as perceived by the masses: they think that everything is broken, slow, and unstable, not because they can make a logical argument, but because they “feel” (in multiple ways) that it is so. And many more.

                                        However, the author’s focus is on security. I totally get where the author is coming from with his “The web is still a weapon” posts. If I take off my developer goggles and look through a user’s eyes, it sure feels like it is all designed to be used as one. He can definitely state his case in a better way, although I think that showing you can interact with an intranet through third-party JavaScript makes the underlying problems, and therefore the message too, very clear.

                                        It also aligns with the CIA’s Timeless tips for sabotage which you can read on that link.

                                        We should think about this very carefully, despite the emotionally inflammatory speech which often accompanies these types of discussions.

                                        1. 1

                                          He can definitely state his case in a better way

                                          I sincerely welcome suggestions.

                                    2. 1

                                      By the same stretch of logic you could claim any limited subset of functionality is the only thing computers should do, in the name of varying forms of “security.”

                                      perhaps something like: “The computer is a tool for doing computation not displaying things to me and potentially warping my view of reality with incorrect information or emotionally inflammatory speech. This is why I have removed any form of internet connectivity.”

                                    3. 7

                                      This is not a bug and it’s not RCE. JavaScript and headers are red herrings here. If you request some URL from a server, you’re going to receive what that server chooses to send you, with or without a browser. There’s a risk in that to be sure, but it’s true by design.

                                      1. 3

                                        Turn off your network and you should eliminate the threat. Turn your computer off completely for a safer mitigation.

                                      1. 3

                                        The proposed fix isn’t really a fix: we already know that users can be made to click “yes” with a bit of social engineering. Not all users all of the time, but few attacks need to work against all users all of the time.

                                        More generally, this attack seems to be just another instance of conflicting security models. Kudos to Google and Mozilla for accepting that they have to choose, and sticking with their choice.

                                        1. 2

                                          And I really think that they have the right to choose!
                                          Same for Microsoft, Apple, Opera… all have the right to pursue their own priorities.

                                          But, IMHO, they should clearly inform their users about the risks of using their browsers.
                                          Including corporate users, obviously. And governments…

                                        1. 2

                                          Ah, I think I understand what the problem is now. Is it the word Arbitrary? When security people say “arbitrary code execution” they refer to code of the attacker’s liking. And in this case, it is far from true! For example, browsers do not allow web applications to remove local files or read your browser history.

                                          The features we expose to JavaScript undergo a lot of scrutiny. The right term to describe the web APIs exposed by browsers would be “Turing complete”. That means you can compute any possible mathematical algorithm that can also be described by a Turing machine.

                                          1. 0

                                            Ah, I think I understand what the problem is now. Is it the word Arbitrary?


                                            I’m not a native English speaker and I do not have a CS degree, so I can only draw on my experience and self-taught knowledge.

                                            According to Wikipedia

                                            In computer security, “arbitrary code execution” is used to describe an attacker’s ability to execute any command of the attacker’s choice on a target machine or in a target process.

                                            Let’s imagine I’m an attacker: I want to put illegal content in the browser cache and then post it on a public forum through the victim’s browser. Or I want to check for open ports on his computer or LAN (this also requires control of a DNS). Or I want to use the computers of some Chinese people to mine Bitcoin, earning something while getting them into trouble.

                                            I could go on for a while…

                                            As I wrote in the bug report, an attacker can easily get control of several of the victim’s resources, like

                                            • their IP
                                            • their bandwidth
                                            • their computing power
                                            • their RAM
                                            • their disk (through the browser cache)

                                            Now, I’m not a security expert, but to my untrained eye these seem like enough resources to carry out a wide range of attacks.
                                            That should satisfy the “any” qualification in the definition above well enough to make the “arbitrary” qualification appropriate.

                                            For example, browsers do not allow web applications to […] read your browser history.

                                            Mmm, let me try, just for fun.

                                            Suppose that I know you (as explained in the bug report, a precondition of the attacks is being able to identify the targets).

                                            I want to know if you visited certain pages.

                                            Can I construct a timing attack against your cache to discover if specific contents are there?
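                                            The principle behind such a cache-timing probe is simple enough to sketch with a toy in-memory cache. Everything here is illustrative (a real attack would time cross-origin resource loads from JavaScript); the URL and delays are made up:

```python
import time

cache = {}

def fetch(url):
    """Toy stand-in for a browser fetching a resource: cached reads are
    near-instant, while a cache miss pays a simulated network round-trip."""
    if url not in cache:
        time.sleep(0.05)          # simulated network latency on a miss
        cache[url] = "body"
    return cache[url]

def timed(url):
    """Time a single fetch, the way attacker-controlled script could."""
    start = time.perf_counter()
    fetch(url)
    return time.perf_counter() - start

cold = timed("https://example.com/some-page")   # first load: slow (miss)
warm = timed("https://example.com/some-page")   # second load: fast (hit)

# The attacker's inference: a load far faster than a known cache miss
# suggests the resource was already in the victim's cache, i.e. visited.
print(warm < cold / 2)
```

                                            Real browsers partially defend against this with cache partitioning per site, which is exactly why the question of whether a given browser does so matters.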

                                            That means you can compute any possible mathematical algorithm that can also be described by a Turing machine.

                                            I think I know what a Turing machine is, but thanks for the recap! ;-)

                                            The problem is that, if the attacker can control a Turing-complete interpreter on the victim’s machine (with access to resources like the ones listed above), she is able to construct several attacks that might hurt the victim or a third party. Leaving NO evidence.

                                            In a way, this is a UI issue: mainstream browsers do not make the risks evident and explicit to people.
                                            And the fact that SRI is not yet mandatory for web pages running JavaScript is a bad sign of their own understanding of the matter.

                                            So, dumb as I am, I have to ask you again to answer this simple question:

                                            Are the attacks described in the bug report possible in Firefox, or not?

                                            Are Firefox’s users world wide vulnerable to them?

                                            If not, please explain how Firefox prevents them.
                                            In the bug report, please, because this thread, despite being referenced in a security issue of a major browser, has been downvoted so much (-7 off-topic, -3 spam) that most interested developers will never see it.

                                            Otherwise, reopen the bug report and give it a proper priority, considering the severity of the threat for your users worldwide.

                                            1. 5

                                              Lobsters is not an appropriate place to troll Firefox security. Stop.

                                              @freddyb I’m sorry about this bizarre thread.

                                          1. 4

                                            Oh, this is a bit rich: closed by Frederik Braun with a link to this discussion.

                                              1. 7

                                                Okay, I’ll bite.

                                                it was suggested by a Mozilla developer to file a bug here:

                                                I agree with what says right away: If you browse to a website. It gives you JavaScript. The browser executes it. That’s by design! Nowadays, the web is specified by W3C and WHATWG as an application platform. You have to accept that the web is not about hypertext anymore.

                                                This is not a bug in Firefox. Are you saying that these attacks are not possible?

                                                I am saying that this is not specific to Firefox, but inherent to the browser as a concept.

                                                Bugzilla is not a discussion forum. Indeed this is a bug report.

                                                Ah, here’s where we disagree. I understand that a bug is an ambiguous concept. This is why we have our Bugzilla etiquette, which also contains a link to Mozilla’s bug writing guidelines.

                                                Furthermore, what you seek to discuss is not specific to Mozilla or Firefox. True. Several other browsers are affected too, but:

                                                1. This doesn’t mean that it’s not a bug in Firefox
                                                2. Since Firefox is a browser “built for people, not for profit”, I think you should be more interested in the topic.

                                                Please elaborate, I am not sure what you mean to imply.

                                                1. 9

                                                  This is just trolling, and I’ll assume you know it. What are you trying to achieve?

                                                  Can we expect reports against the Linux kernel and bash because they facilitate inserting a USB stick given by some other party and running a shell script that’s free to read files in your home directory? Against Debian/apt for executing install scripts obtained from user specified package repositories?

                                              1. 13

                                                This is not news, it’s a raw source.

                                                Until it’s proven, we shouldn’t consider these statements either as news or as fakes.

                                                But IMHO, it’s pretty good material for hackers to verify.

                                                1. 4

                                                  Maybe someone on any of these threads has a Tesla, we have some pentesters on Lobsters, and maybe they’d let them see if an SSH response happens. That by itself would substantiate the claim with near-zero risk of damage. Well, there might be some stuff to probe and crack to get to that part, depending on implementation. And hacking a Tesla might void some warranty. ;)

                                                  EDIT: The thread friendlysock linked to had this quote, which indicates it should be easy if the source is a knowledgeable insider:

                                                  “99% of what i’m talking about is “public” anyway. tesla isn’t encrypting their firmware and it’s really easy to glean information from the vpn with a packet cap because nothing inside the vpn (was) encrypted. dumping tegra 3 model s and x is trivial and tesla’s cars are nowhere near as secure as they’d have you believe.”

                                                1. 1

                                                  I do not trust the Software Freedom Conservancy (and with good reason), but I agree with most of what is written here, except:

                                                  Copyright and other legal systems give authors the power to decide what license to choose […]
                                                  In my view, it’s a power which you don’t deserve — that allows you to restrict others.

                                                  As an author of free software myself, I think that I totally deserve the right to decide who can use my work, and how.

                                                  1. 3

                                                    I read the article you linked to but didn’t really understand how that means the SFC can’t be trusted. Because a project under their umbrella rebased a git repo?

                                                    1. 1


                                                      I cannot trust them anymore because, when the project joined the Conservancy, I explicitly asked them how my copyright was going to change, and Karen Sandler replied that it was not going to change.

                                                      A year later I discovered that my name had been completely removed from the sources.

                                                      According to the GPLv2, this violation causes a definitive termination of the project’s rights to use or modify the software.

                                                      Now, I informed Sandler about that mess (before the rebase) and never heard back from her, despite several of my contributions getting “accidentally squashed” during the rebase.

                                                      That’s why I cannot trust them anymore.

                                                      Because they are still supporting a project that purposely violated the GPLv2 (causing its definitive termination), and despite the fact that I gave them the chance to fix this, they didn’t… and they tried to remove all evidence of this violation and of the license termination with a rebase… (which still squashed some of my commits).

                                                    2. 2

                                                      He’s objecting to restricting others in the way that proprietary software does; that’s the right he says you shouldn’t have. I think you edited out the part of your quote where bkuhn was talking about that.

                                                      But more to your point, I also think that your right to decide how others can use your work should be very limited. With software, an unlimited number of people can benefit from using your work in ways you may disagree with, while you would be the only one who would object. As a bargain with society, your authorial rights should be given smaller weight than the rights of your users.

                                                      1. 1

                                                        As a bargain with society, your authorial rights should be given smaller weight than the rights of your users.

                                                        Is this a principle that you believe should be only applied to software?

                                                        Because if not, one could argue that a person’s special skills (say, as a doctor) are so valuable to society that that person should work for free to ensure that the greatest number of people have access to their skill.

                                                        If the principle is restricted to expression, a photograph I take of a person could be freely used by a political party that I despise to further their cause through propaganda. I am only one person, and they are many. My pretty picture can help them more than it helps me. So according to the principle above (as I read it) they should have unrestricted access to my work.

                                                        I believe that the current regime of IP legislation is weighted too much towards copyright holders, but to argue that a creator should have no rights to decide how their work is used is going too far.

                                                        1. 2

                                                          Software is different from doctors because software can be reproduced indefinitely without inconveniencing the author. Photographs are more similar to software than doctors are.

                                                          I also didn’t say an author should have no rights. I just said their rights should weigh less. For example, copyrights should expire after, say, 10 years, instead of lasting forever as they de facto do now.

                                                          1. 2

                                                            Thanks for clarifying your position in this matter.

                                                            I think we are broadly in agreement, especially with regards to the pernicious effects of “infinite copyright”.

                                                            1. 2

                                                              It’s funny that I’m taking copyright’s side here…

                                                              Let’s put it this way: if I invented a clean energy source, I would do my best to ensure it was not turned into a weapon.

                                                              Same with software.

                                                              It’s my work, thus my responsibility.

                                                      1. 1

                                                        An interesting read, but I do not think it’s looking at the issues of the Web with a developer’s hat on.

                                                        While some of these issues are serious geopolitical threats for most nations around the world (810 out of 930 DNS root servers are under US control), others are technical issues that should be addressed at the software level.

                                                        I think it’s time for the W3C to release a new version of XHTML that addresses all this mess once and for all: let’s remove JavaScript (and WebAssembly) from the browsers, let’s add easy-to-remove tags like ADVERTISING, let’s extend form controls, add more hypertext controls for video and audio without the need for JavaScript, and so on…

                                                        Basically, the only way to fix the Web is to make it a hypertext medium again.

                                                        1. 9

                                                          From the article:

                                                          Another issue is whether the customer should install the fix at all. Many computer users don’t allow outside or unprivileged users to run on their CPUs the way a cloud or hosting company does.

                                                          I guess the key there is “the way a cloud or hosting company does.” Users typically run browsers, which locally run remotely-fetched arbitrary code as a feature. I would argue that because of browsers, users should especially install the fixes.

                                                          The only time a fix may not be applicable is on single-tenant configurations where remotely-fetched arbitrary code isn’t run locally.

                                                          1. 1

                                                            Users typically run browsers, which locally run remotely-fetched arbitrary code as a feature.

                                                            I was going to point this out too, but you got there first.

                                                            However, this opens an entirely different vulnerability set, a Pandora’s box that no one dares to face.

                                                            1. 2

                                                              Great read, thanks.

                                                          1. 13

                                                            Well, what if a researcher does all these things anyway?

                                                            When they publish the results, their licence ends. So what?

                                                            Also, no state could allow the installation of such microcode on its hardware, exactly because of this clause.

                                                            1. 8

                                                              This license, whether on purpose or by accident (see my other comment in this thread for elaboration), is granted to and focuses on OEMs:

                                                              1. PURPOSE. You seek to obtain, and Intel desires to provide You, under the terms of this Agreement, Software solely for Your efforts to develop and distribute products integrating Intel hardware and Intel software. […]

                                                              If you are a systems integrator, there is more than this license agreement binding you and Intel together. If you are not a systems integrator, this license isn’t about you, making the bolded assertion in the article false by being too broad:

                                                              Intel has now attempted to gag anyone who would collect information for reporting about those penalties, through a restriction in their license.

                                                              Intel made either a mistake or policy change related to their systems integrators. We will all get our benchmarks.